Search Results

Search found 6798 results on 272 pages for 'composite controls'.

  • 302: this blog will be closed

    - by preishuber
    After nearly 7 years I will discontinue blogging on this site. My resources are limited. You can reach my German blog, which I use to support my customers. Looking back on a long and interesting journey: ASP.NET by ScottGu - that was the reason to join this site and support Microsoft as much as I could. For that I was honored as an ASP.NET MVP - thanks again. Met Scott several times. Great guy! Forums - I left the NNTP forums a few years ago and now Microsoft has closed them - it was my idea ;-) AJAX - was the wrong way; jQuery won that game. IIS7 - that is really a great platform and the IIS team rules. I am sad that it is so quiet around that topic. ASP.NET after 2.0 - is no longer my world. I love ASP.NET and ASP.NET server controls. I hate the discussion about how to follow the holy rules of MVC. Microsoft has dropped the goal of making ASP.NET #1 and accepted that PHP is. Facebook & Twittering - microblogging is taking over part of the blogging business. Shorter, faster, cheaper - or as SteveB put it, do more with less. Google - Google is taking over the web. I use Bing whenever I can, but Google has more options. Sorry Microsoft, you will lose that game. Apple - that is not Microsoft's biggest problem. The Ixxx takes over a small part of the market but big money, and the customers are not strongly tied to it. New wave, new hype - game over, Apple. Silverlight - my new home. I can reuse a lot of my skills and I love the possibilities. Silverlight will pass WPF and strike at Flash. Windows Phone 7 - my skills fit here too. I will just use it for fun. I am not really satisfied with what I heard from MIX. Guys from Redmond, I am sad to say you had the best smartphone OS and lost everything. The ADO vNext story - that will be the next mystery. WCF, REST, JSON, ATOM and now OData. Nothing about SQL commands. LINQ and ORM are also not the final solution for multilayered, disconnected, async scenarios. Personally I prefer the OData idea and dislike the Swiss Army knife (German: Eierlegende Wollmilchsau) that is WCF. I am still on the INETA speakers board and I am glad to come to your user group. In all other cases you can hire me through ppedv AG. Goodbye and have a good life.

    Read the article

  • Network Your Computers & Devices: Step by Step

    - by The Geek
    If you’re looking for a great book to help you learn more about Windows home networking, there’s a new book on the market by our good friend Ciprian, and published by none other than Microsoft Press. Note: our friend Ciprian has been a guest contributor here on How-To Geek in the past, and he’s not only a geek that knows what he’s talking about, he’s also one of the more honest and decent people I’ve worked with. In his spare time, he runs the 7 Tutorials web site. The Book: One of the great things about this book is that you aren’t limited to just Windows networking—it also explains how to connect Windows 7, XP, Vista, Mac OS X, and even Linux on the same network and share folders and devices between them. Everything in the book is written in a typical How-To Geek step-by-step format, with plenty of screenshots and pictures to help you through the process. Book Outline: If you’re going to be spending some money on the book, you probably want to know what it’s all about, and since the Amazon page doesn’t give, well, much information at all, here’s the entire outline for you:
    - Setting Up a Router and Devices
    - Setting User Account on All Computers
    - Setting Up Your Libraries on All Windows 7 Computers
    - Creating the Network
    - Customizing Network Sharing Settings in Windows 7
    - Creating the Homegroup and Joining Windows 7 Computers
    - Sharing Libraries and Folders
    - Sharing and Working with Devices
    - Streaming Media Over the Network and the Internet
    - Sharing Between Windows XP, Windows Vista, and Windows 7 Computers
    - Sharing Between Mac OS X and Windows 7 Computers
    - Sharing Between Ubuntu Linux and Windows 7 Computers
    - Keeping the Network Secure
    - Setting Up Parental Controls
    - Troubleshooting Network and Internet Problems
    It’s a great book, with loads of information, and compared to most tech books isn’t very expensive—only $19.79 for the paperback and $9.99 for the Kindle version. Well worth it, and hey, it’s an official Microsoft Press book—written by a How-To Geek guest author. Network Your Computers & Devices Step by Step [Amazon]

    Read the article

  • nautilus selected item color

    - by shantanu
    As you can see in the image, the selected item "build" has the same black colour as the background. How can I change the selected item colour in the GTK3 theme's nautilus.css file? Which section's colour do I need to modify?
    /* desktop mode */
    .nautilus-desktop.nautilus-canvas-item { color: @bg_color; text-shadow: 1 1 alpha (#001B33, 0.8); }
    .nautilus-desktop.nautilus-canvas-item:active { background-image: none; background-color: alpha (@selected_bg_color, 0.84); border-radius: 4; color: @fg_color; }
    .nautilus-desktop.nautilus-canvas-item:selected { background-image: none; background-color: alpha (@bg_color, 0.84); border-radius: 4; color: @selected_fg_color; }
    .nautilus-desktop.nautilus-canvas-item:active,
    .nautilus-desktop.nautilus-canvas-item:prelight,
    .nautilus-desktop.nautilus-canvas-item:selected { text-shadow: none; }
    /* browser window */
    NautilusTrashBar.info,
    NautilusXContentBar.info,
    NautilusSearchBar.info,
    NautilusQueryEditor.info { /* this background-color controls the symbolic icon in the entry */ background-color: mix (@fg_color, @base_color, 0.3); border-radius: 0; border-style: solid; border-width: 0 1 1 1; }
    NautilusSearchBar .entry { }
    .nautilus-cluebar-label { color: @fg_color; font: bold; }
    #nautilus-search-button *:active,
    #nautilus-search-button *:active:prelight { color: @dark_fg_color; }
    NautilusFloatingBar { background-color: @info_bg_color; border-radius: 3 3 0 0; border-style: solid; border-width: 1; border-color: darker (@info_bg_color); -unico-border-gradient: none; }
    NautilusFloatingBar .button { -GtkButton-image-spacing: 0; -GtkButton-inner-border: 0; }
    /* sidebar */
    NautilusWindow .sidebar,
    NautilusWindow .sidebar .view { background-color: @bg_color; color: @fg_color; }
    NautilusWindow .sidebar .frame { }
    NautilusWindow > GtkTable > .pane-separator { background-color: @bg_color; border-color: @bg_color; border-width: 0 0 0 0; border-style: solid; }

    Read the article

  • BizTalk 2009 - SQL Server Job Configuration

    - by StuartBrierley
    Following the installation of BizTalk Server 2009 on my development laptop I used the BizTalk Server Best Practice Analyser, which highlighted the fact that two of the SQL Server Agent jobs that BizTalk relies on were not running successfully. Upon investigation it turned out that these jobs needed to be configured before they would run successfully. To configure these jobs open SQL Server Management Studio, expand SQL Server Agent > Jobs and double click on the appropriate job. Select Steps and then edit the appropriate entries.
    Backup BizTalk Server (BizTalkMgmtDb)
    This job is comprised of three steps: BackupFull, MarkAndBackupLog and ClearBackupHistory.
    BackupFull
    exec [dbo].[sp_BackupAllFull_Schedule] 'd' /* Frequency */, 'BTS' /* Name */, '<destination path>' /* location of backup files */
    The frequency here is set/left as daily and the name is left as BTS. You must provide a full destination path for the backup files to be stored. There are also two optional parameters: a flag that controls whether the job forces a full backup if a partial backup fails, and a parameter to control the time of day to run the full backup (the default is midnight UTC time). For example:
    exec [dbo].[sp_BackupAllFull_Schedule] 'd' /* Frequency */, 'BTS' /* Name */, '<destination path>' /* location of backup files */, 0, 22
    MarkAndBackUpLog
    exec [dbo].[sp_MarkAll] 'BTS' /* Log mark name */, '<destination path>' /* location of backup files */
    You must provide a destination path for the log backups. Optionally you can also add an extra parameter that tells the procedure to use local time:
    exec [dbo].[sp_MarkAll] 'BTS' /* Log mark name */, '<destination path>' /* location of backup files */, 1
    Clear Backup History
    exec [dbo].[sp_DeleteBackupHistory] @DaysToKeep=7
    This will clear out the instances in the MarkLog table older than 7 days.
    DTA Purge and Archive (BizTalkDTADb)
    This job is comprised of a single step.
    Archive and Purge
    exec dtasp_BackupAndPurgeTrackingDatabase 0, --@nLiveHours tinyint, 1, --@nLiveDays tinyint = 0, 30, --@nHardDeleteDays tinyint = 0, null, --@nvcFolder nvarchar(1024) = null, null, --@nvcValidatingServer sysname = null, 0 --@fForceBackup int = 0
    Any completed instance that is older than the live days plus live hours will be deleted, as will any associated data. Any data older than the HardDeleteDays will be deleted - this means that those long running orchestration instances that would otherwise never be purged will at some point have their data cleared down while allowing the instance to continue, thus preventing the DTA database from growing indefinitely. This should always be greater than the soft purge window. The NVC folder is the path for the backup files; if this is null the job will not run, and the DTA Purge and Archive (BizTalkDTADb) job fails (visible in SQL Server Management Studio under Job Activity Monitor, view history, Archive and Purge step) with the error: The @nvcFolder parameter cannot be null. How long you choose to keep instances in the Tracking Database is really up to you. For development I have set this up as:
    exec dtasp_BackupAndPurgeTrackingDatabase 0, 1, 30, '<destination path>', null, 0
    On a live server you may want to adjust these figures:
    exec dtasp_BackupAndPurgeTrackingDatabase 0, 15, 20, '<destination path>', null, 0

    Read the article

  • SQL Azure and Trust Services

    - by BuckWoody
    Microsoft is working on a new Windows Azure service called “Trust Services”. Trust Services takes a certificate you upload and uses it to encrypt and decrypt sensitive data in the cloud. Of course, like any security service, there’s a bit more to it than that. I’ll give you a quick overview of how you can use this product to protect data you send to SQL Azure. The primary issue with storing data in the cloud is that you are in an environment that isn’t under your control – in fact, that’s the benefit of being in a distributed computing environment in the first place. On premises you’re able to encrypt data you don’t want anyone else to see, using various methods such as passwords (not very strong) or certificates (stronger). When you use a certificate, it’s vital that you create (or procure) and protect it yourself. When you store data remotely, regardless of IaaS, PaaS or SaaS, you don’t own the machines where the data lives. That means if you use a certificate from the cloud vendor to encrypt the data, you have to trust that the data won’t be accessed by the vendor. In some cases having a signed agreement with the vendor that they won’t access your data is sufficient; in other cases that doesn’t meet the requirements your system has for security. With the new Trust Services service, the basic process is that you use a portal to create a Trust Server using policies and other controls. You place an X.509 certificate you create or procure in that server. Using the Software Development Kit (SDK), the developer has access to an Application Layer Encryption Framework to set fields of data they want to encrypt. From there, the data can be stored in SQL Azure as a standard field – only it is encrypted before it ever arrives. The portion of the client software that decrypts the data uses the same service, so the authenticated user sees the data if they are allowed to do so. The data remains encrypted “at rest”. You can learn more about this product and check it out in the SQL Azure labs at Microsoft Codename "Trust Services".
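    To make the "encrypted before it ever arrives" idea concrete, here is a minimal C# sketch of the general application-layer pattern - encrypting a field with your own X.509 certificate on the client so that SQL Azure only ever stores ciphertext. This is illustrative only and does not use the Trust Services SDK (whose API is not shown in this post); the class and method names are assumptions made up for the example.

        using System;
        using System.Security.Cryptography;
        using System.Security.Cryptography.X509Certificates;
        using System.Text;

        // Illustrative pattern only - NOT the Trust Services SDK.
        // The certificate is created or procured and held by you, not the cloud vendor.
        public static class FieldProtector
        {
            // Encrypt a short, sensitive field value; the Base64 string returned
            // is what gets stored in an ordinary SQL Azure column.
            public static string Encrypt(string plainText, X509Certificate2 cert)
            {
                var rsa = (RSACryptoServiceProvider)cert.PublicKey.Key;
                byte[] cipher = rsa.Encrypt(Encoding.UTF8.GetBytes(plainText), true); // OAEP padding
                return Convert.ToBase64String(cipher);
            }

            // Decrypt on the client with the private key; the database never sees plaintext.
            public static string Decrypt(string base64Cipher, X509Certificate2 certWithPrivateKey)
            {
                var rsa = (RSACryptoServiceProvider)certWithPrivateKey.PrivateKey;
                byte[] plain = rsa.Decrypt(Convert.FromBase64String(base64Cipher), true);
                return Encoding.UTF8.GetString(plain);
            }
        }

    Note that raw RSA only suits short values; a production framework (presumably Trust Services included) would encrypt a symmetric key with the certificate and use that key for the field data. The point of the sketch is simply that the value reaches SQL Azure as opaque ciphertext in a standard column.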

    Read the article

  • One Week To Go: OTN Architect Day: Cloud Computing

    - by Bob Rhubart
    One week remains until OTN Architect Day: Cloud Computing kicks off at the spectacular Oracle HQ campus in Redwood Shores, CA. The event is free, and there is still time to register.
    When: Tuesday July 9, 2013, 8:30am - 12:30pm
    Where: Oracle Conference Center, 350 Oracle Pkwy, Redwood City, CA 94065
    Register now. It's free! Here's the latest update to the event agenda:
    8:30am - 9:00am Registration and Continental Breakfast
    9:00am - 9:45am Keynote: 21st Century IT | Dr. James Baty, VP, Global Enterprise Architecture Program, Oracle. Imagine a time long, long ago. A time when servers were certified and dedicated to specific applications, when anything posted on an enterprise web site was from restricted, approved channels, and when we tried to limit the growth of 'dirty' data and storage. Today, applications are services running in the multi-tenant hybrid cloud. Companies beg their customers to tweet them, friend them, and publicly rate their products. And constantly analyzing a deluge of Internet, social and sensor data is the key to creating the next super-successful product, or capturing an evil terrorist. The old IT architecture was planned, dedicated, stable, controlled, with separate and well-defined roles. The new architecture is shared, dynamic, continuous, XaaS, DevOps. This keynote session describes the challenges and opportunities that the new business / IT paradigms present to the IT architecture and architects.
    9:45am - 10:30am Technical Session: Oracle Cloud: A Case Study in Building a Cloud | Anbu Krishnaswami, Enterprise Architect, Oracle. Building a Cloud can be challenging thanks to the complex requirements unique to Cloud computing and the massive scale typically associated with Cloud. Cloud providers can take an Infrastructure as a Service (IaaS) approach and build a cloud on virtualized commodity hardware, or they can take the Platform as a Service (PaaS) path, a service-oriented approach based on pre-configured, integrated, engineered systems. This presentation uses the Oracle Cloud itself as a case study in the use of engineered systems, demonstrating how the technical design of engineered systems is leveraged for building PaaS and SaaS Cloud services and a Cloud management infrastructure. The presentation will also explore the principles, patterns, best practices, and architecture views provided in Oracle's Cloud reference architecture.
    10:30am - 10:45am Break
    10:45am - 11:30am Technical Session: Database as a Service | Markus Michalewicz, Senior Principal Product Manager, Oracle Real Application Clusters (RAC). New applications are now commonly built in a Cloud model, where the database is consumed as a service, and many established business processes are beginning to migrate to database as a service (DBaaS). This adoption of DBaaS is made possible by the availability of new capabilities in the database that enable resource pooling, dynamic resource management, model-based provisioning, metered use, and effective quality-of-service controls. This session will examine the catalog of database services at a large commercial bank to understand how these capabilities are enabling DBaaS for a wide range of needs within the enterprise.
    11:30am - 12:00pm Panel Q&A: Dr. James Baty, Anbu Krishnaswami, and Markus Michalewicz respond to audience questions.
    Registration is free, but seating is limited, so register now.

    Read the article

  • Five hours of Task Flow Overview Recordings Available

    - by Frank Nimphius
    In addition to the ADF Controller task flow documentation in the Oracle Fusion Middleware Fusion Developer's Guide for Oracle Application Development Framework 11g Release 1 (http://download.oracle.com/docs/cd/E21764_01/web.1111/b31974/partpage3.htm#BABHIIAI), the ADF Insider website (http://www.oracle.com/technetwork/developer-tools/adf/learnmore/adfinsider-093342.html) hosts five online videos that explain how to build and work with ADF Controller task flows in Oracle ADF.
    ADF Task Flow - Overview (Part 1): This 90 minute recording introduces the concept of ADF unbounded and bounded task flows, as well as other ADF Controller features. The session starts with an overview of unbounded task flows, bounded task flows and the different activities that exist for developers to build complex application flows. Exception handling and the train navigation model are also covered in this first part of a two-part series. By developing a sample application, the recording guides viewers through building unbounded and bounded task flows. This session is continued in a second part. http://download.oracle.com/otn_hosted_doc/jdeveloper/11gdemos/taskflow-overview-p1/taskflow-overview-p1.html
    ADF Task Flow - Overview (Part 2): This 75 minute session continues where part 1 ended and completes the sample application that guides viewers through different aspects of unbounded and bounded task flow development. In this recording, memory scopes, save for later, task flows opening in dialogs and remote task flow calls are explained and demonstrated. If you are new to ADF task flows, it is recommended to first watch part 1 of this series to be able to follow the explanation guided by the sample application. http://download.oracle.com/otn_hosted_doc/jdeveloper/11gdemos/taskflow-overview-p2/taskflow-overview-p2.html
    ADF Region Interaction - An Overview: This session covers most of the options that exist for communicating between regions. It briefly discusses what it takes to build regions from bounded task flows before going into details using slides and samples. The following interactions are explained: contextual events, queue action in region, input parameters and PPR, drag and drop, shared Data Controls, parent action and region navigation listener. http://download.oracle.com/otn_hosted_doc/jdeveloper/11gdemos/adf-region-interaction/adf-region-interaction.html
    ADF Region Interaction - Contextual Events: A contextual event is used as a communication channel between a parent view and its contained regions, as well as between regions. By example, this session explains how to set up contextual events, how to define producers and event listeners and how to define the payload message. http://download.oracle.com/otn_hosted_doc/jdeveloper/11gdemos/AdfInsiderContextualEvents/AdfInsiderContextualEvents.html

    Read the article

  • Why CoffeeScript is tough to maintain

    - by Renso
    I recently started trying out CoffeeScript, only to find that it caused more headaches. The abstraction level of jQuery was perfect: it did not dictate to coders how to design their code, it just works. However, I recently posted a request to the CoffeeScript team to consider introducing curly braces to help with more complex code that controls the flow of logic. For example, an if-then-else with many nested levels can be near impossible to debug without tracing through it when using CoffeeScript. Also, with IDEs like Visual Studio, regular JavaScript IntelliSense and auto-formatting make it easy to appropriately indent nested levels without any work on the part of the developer, and reading it is not that hard, especially with some extensions that show vertical lines in the code editor to help see what is nested within what part of the code. However, with CoffeeScript that is not the case. The samples given on the CoffeeScript web site are of course just simple examples to explain the features, and one gets excited pretty quickly over the powerful shortcuts. I tried to convert a piece of JavaScript over to CoffeeScript and gave up, since you need to first of all remove ALL non-CoffeeScript coding constructs for it to even compile. However, js2coffee can help with that. Still, keeping track of nested levels became something that was simply not manageable using CoffeeScript. Furthermore, any coding language that controls the flow of logic by indentation is extremely dangerous for obvious reasons. I liked CoffeeScript a lot, but the fact that the logical flow of the code is controlled by how much you indent code, spaces or tabs, is not reliable, as there is no easy way for the programmer to know what parts of the code will get hit when the code spans a page. When I suggested introducing curly braces to the CoffeeScript team, one contributor advised me that my code needed to be redesigned! Needless to say that is absurd. When I included a piece of the code he asked me if it was legacy code. It's like saying to a Java programmer, sorry, you cannot use Java because we don't agree with how you write your code. jashkenas from the CoffeeScript blog gave some great suggestions and made the point that introducing curly braces would be very problematic for them, as they use them to denote objects. Makes sense, but I would still love to see some way to replace code flow control via spaces and indentation with something more concrete and human readable.

    Read the article

  • Upgrade from ASP.NET MVC 1 to MVC 2 - how to, and issues with JsonRequestBehavior

    - by Renso
    Goal
    Upgrade your MVC 1 app to MVC 2.
    Issues
    You may get errors about your Json data being returned via a GET request violating security principles - we also address this here. This post is not intended to delve into why the Json GET request is or may be an issue, just how to resolve it as part of upgrading from MVC 1 to 2.
    Solution
    First remove all references from your projects to the MVC 1 dll and replace it with the MVC 2 dll. Now update your web.config file in your web app root folder by simply changing references to assembly="System.Web.Mvc, Version=1.0.0.0, ..." to Version=2.0.0.0. There are a couple of references in your config file; here are probably most of them you may have:
        <compilation debug="true" defaultLanguage="c#">
            <assemblies>
                <add assembly="System.Web.Mvc, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
            </assemblies>
        </compilation>
        <pages masterPageFile="~/Views/Masters/CRMTemplate.master" pageParserFilterType="System.Web.Mvc.ViewTypeParserFilter, System.Web.Mvc, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" pageBaseType="System.Web.Mvc.ViewPage, System.Web.Mvc, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" userControlBaseType="System.Web.Mvc.ViewUserControl, System.Web.Mvc, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" validateRequest="False">
            <controls>
                <add assembly="System.Web.Mvc, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" namespace="System.Web.Mvc" tagPrefix="mvc" />
    Secondly, if you return Json objects from an ajax call via the GET method, you have several options to fix this depending on your situation:
    1. The simplest - in my case I did this for an internal web app - is to simply do:
        return Json(myObject, JsonRequestBehavior.AllowGet);
    2. In MVC, if you have a controller base you could wrap the Json method with:
        public new JsonResult Json(object data)
        {
            return Json(data, "application/json", JsonRequestBehavior.AllowGet);
        }
    3. The most work would be to decorate your Actions with:
        [AcceptVerbs(HttpVerbs.Get)]
    4. Another option, also a lot of work since it needs to be done for every ajax call returning Json, is:
        msg = $.ajax({ url: $('#ajaxGetSampleUrl').val(), dataType: 'json', type: 'POST', async: false, data: { name: theClass }, success: function(data, result) { if (!result) alert('Failure to retrieve the Sample Data.'); } }).responseText;
    This should cover all the issues you may run into when upgrading. Let me know if you run into any other ones.
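    As a quick illustration of option 1, here is roughly what a GET-friendly action might look like after the upgrade; the controller, action and model names are made up for the example:

        using System.Web.Mvc;

        public class SampleController : Controller
        {
            // Called from jQuery via $.ajax with type: 'GET'.
            // Without JsonRequestBehavior.AllowGet, MVC 2 throws an
            // InvalidOperationException warning that sensitive information
            // could be disclosed to third-party sites via a GET request.
            public ActionResult GetSample(string name)
            {
                var myObject = new { Name = name, Status = "OK" };
                return Json(myObject, JsonRequestBehavior.AllowGet);
            }
        }

    If you go with option 2 instead, your actions keep calling Json(myObject) as they did in MVC 1 and the controller base supplies AllowGet in one place, which is why it involves less churn across existing actions.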

    Read the article

  • Partner Blog Series: PwC Perspectives - The Gotchas, The Do's and Don'ts for IDM Implementations

    - by Tanu Sood
    It is generally accepted among business communities that technology by itself is not a silver bullet for all problems, but when it is combined with leading practices, strategy, careful planning and execution, it can create a recipe for success. This post attempts to highlight some of the best practices along with dos & don'ts that our practice has accumulated over the years in the identity & access management space in general, and in the context of R2 in particular.
    Best Practices
    The following section illustrates the leading practices in how to plan, implement and sustain a successful OIM deployment, based on our collective experience.
    Planning is critical, but often overlooked
    A common approach to planning an IAM program that we identify with our clients is the three-step process involving a current state assessment, a future state roadmap and an executable strategy to get there. It is extremely beneficial for clients to assess their current IAM state, perform gap analysis, document the recommended controls to address the gaps, align the future state roadmap to business initiatives and get buy-in from all stakeholders involved to improve the chances of success. When designing an enterprise-wide solution, the scalability of the technology must accommodate the future growth of the enterprise and the projected identity transactions over several years. Aligning the implementation schedule of OIM to related information technology projects increases the chances of success. As a baseline, it is recommended to match hardware specifications to the sizing guide for R2 published by Oracle. Adherence to this will help ensure that the hardware used to support OIM will not become a bottleneck as the adoption of new services increases. If your organization has numerous connected applications that rely on reconciliation to synchronize the access data into OIM, consider hosting dedicated instances to handle reconciliation. Finally, ensure the use of a clustered environment for development and have at least three total environments to help facilitate a controlled migration to production. If your organization is planning to implement role-based access control, we recommend performing a role mining exercise and consolidating your enterprise roles to keep them manageable. In addition, many organizations have multiple approval flows to control access to critical roles, applications and entitlements. If your organization falls into this category, we highly recommend that you limit the number of approval workflows to a small set. Most organizations have operations managed across data centers with backend database synchronization; if your organization falls into this category, ensure that the overall latency between the datacenters when replicating the databases is less than ten milliseconds so that there are no front office performance impacts.
    Ingredients for a successful implementation
    During the development phase of your project, there are a number of guidelines that can be followed to help increase the chances of success. Most implementations cannot be completed without the use of customizations. If your implementation requires this, it's a good practice to perform code reviews to help ensure quality and reduce code bottlenecks related to performance. We have observed at our clients that the development process works best when team members adhere to leading coding practices. Plan for time to correct coding defects and ensure developers are empowered to report their own bugs for maximum transparency.
    Many organizations struggle with defining a consistent approach to managing logs. This is particularly important due to the amount of information that can be logged by OIM. We recommend Oracle Diagnostic Logging (ODL) as the alternative to use for logging. ODL allows log files to be formatted in XML for easy parsing and does not require a server restart when log levels are changed during troubleshooting. Testing is a vital part of any large project, and an OIM R2 implementation is no exception. We suggest that at least one lower environment should use production-like data and connectors, and configurations should match as closely as possible. For example, use secure channels between OIM and target platforms in pre-production environments to test the configurations, the migration processes of certificates, and the additional overhead that encryption could impose. Finally, we ask our clients to perform database backups regularly and before any major change event, such as a patch or migration between environments. In the lowest environments, we recommend having at least a weekly backup in order to prevent significant loss of time and effort. Similarly, if your organization is using virtual machines for one or more of the environments, it is recommended to take frequent snapshots so that rollbacks can occur in the event of improper configuration.
    Operate & sustain the solution to derive maximum benefits
    When migrating OIM R2 to production, it is important to perform certain activities that will help achieve a smoother transition. At our clients, we have seen that splitting the OIM tables into their own tablespaces by category (physical tables, indexes, etc.) can help manage database growth effectively. If we notice that a client hasn't enabled the Oracle-recommended indexing in the applicable database, we strongly suggest doing so to improve performance. Additionally, we work with our clients to make sure that the audit level is set to fit the organization's auditing needs, and sometimes even allocate UPA tables and indexes into their own tablespace for better maintenance. Finally, many of our clients have set up schedules for reconciliation tables to be archived at regular intervals in order to keep the size of the database(s) reasonable and maintain optimal database performance. For clients that anticipate availability issues with target applications, we strongly encourage the use of the offline provisioning capabilities of OIM R2. This reduces the provisioning process's dependency on target application availability and helps avoid broken workflows. To account for this and other abnormalities, we also advocate that OIM's monitoring controls be configured to alert administrators to any abnormal situations. Within OIM R2, we have begun advising our clients to utilize the 'profile' feature to encapsulate multiple commonly requested accounts, roles, and/or entitlements into a single item. By setting up a number of profiles that can be searched for and used, users will spend less time performing the exact same steps for common tasks. We advise our clients to follow the Oracle-recommended guides for database and application server tuning, which provide a good baseline configuration. They offer guidance on database connection pools, connection timeouts, user interface threads and proper handling of adapters/plug-ins. All of these can be important configurations that will allow faster provisioning and web page response times.
    Many of our clients have begun to recognize the value of data mining and a remediation process during the initial phases of an implementation (to help ensure high quality data gets loaded) and beyond (to support ongoing maintenance and business-as-usual processes). A successful program always begins with identifying the data elements and assigning a classification level based on criticality, risk, and availability. It should finish by following through with a remediation process.
    Dos & Don'ts
    Here are the most common dos and don'ts that we socialize with our clients, derived from our experience implementing the solution.
    Dos:
    - Scope the project into phases with realistic goals. Look for quick wins to show success and value to the stakeholders.
    - Establish an enterprise ID (a universally unique ID across the enterprise) early in the program.
    - Have a plan in place to patch during the project, which helps alleviate any major issues or roadblocks (product and database).
    - Assess your current state and prepare a roadmap to address your operational, tactical and strategic goals; align it with your business priorities.
    - Defer complex integrations to later phases and take advantage of lessons learned from previous phases.
    - Have an identity and access data quality initiative built into your plan to identify and remediate data-related issues early on.
    - Identify an owner of the identity systems with fair IdM knowledge and empower them with the authority to make product-related decisions. This will help overcome any design hurdles.
    - Shadow your internal or external consulting resources during the implementation to build the product skills needed to operate and sustain the solution.
    Don'ts:
    - Avoid "boiling the ocean" and trying to integrate all enterprise applications in the first phase.
    - Avoid major UI customizations that require code changes.
    - Avoid publishing all the target entitlements if you don't anticipate their usage during access requests.
    - Avoid integrating non-production environments with your production target systems.
    - Avoid creating multiple accounts for the same user on the same system, if there is an opportunity to do so.
    - Avoid creating complex approval workflows that would negatively impact productivity and SLAs.
    - Avoid creating complex designs that are not sustainable long term and would need a major overhaul during upgrades.
    - Avoid treating IAM as a point solution; have an appropriate level of communication and a training plan for IT and business users alike.
    Conclusion
    In our experience, identity programs will struggle with scope, proper resourcing, and more. We suggest that companies consider the suggestions discussed in this post and leverage them to help enable their identity and access program. This concludes PwC's blog series on R2 for the month, and we sincerely hope that the information we have shared thus far has been beneficial. For more information or if you have questions, you can reach out to Rex Thexton, Senior Managing Director, PwC, and/or Dharma Padala, Director, PwC. We look forward to hearing from you.
    Meet the Writers:
    Dharma Padala is a Director in the Advisory Security practice within PwC. He has been implementing medium to large scale Identity Management solutions across multiple industries including utility, health care, entertainment, retail and financial sectors. Dharma has 14 years of experience in delivering IT solutions, out of which he has been implementing Identity Management solutions for the past 8 years.
    Praveen Krishna is a Manager in the Advisory Security practice within PwC. Over the last decade Praveen has helped clients plan, architect and implement Oracle identity solutions across diverse industries. His experience includes delivering security across diverse topics like network, infrastructure, application and data, where he brings a holistic point of view to problem solving.
    Scott MacDonald is a Director in the Advisory Security practice within PwC. He has consulted for several clients across multiple industries including financial services, health care, automotive and retail. Scott has 10 years of experience in delivering Identity Management solutions.
    John Misczak is a member of the Advisory Security practice within PwC. He has experience implementing multiple Identity and Access Management solutions, specializing in Oracle Identity Manager and Business Process Execution Language (BPEL).

    Read the article

  • Oracle Manageability Presentations at Collaborate 2012

    - by Get_Specialized!
    Attending the Collaborate 2012 event, April 22-26th in Las Vegas, and interested in learning more about becoming specialized on Oracle Manageability? Be sure to check out the sessions below, presented by subject matter experts, while you're onsite. Set up a meeting or be one of the first Oracle Partners onsite to ask me, and we'll request one of the limited FREE Oracle Enterprise Manager 12c partner certification exam vouchers for you. Can't travel this year? COLLABORATE 12 Plug Into Vegas may be another option, letting you attend presentations from your own desk, such as session #489, Oracle Enterprise Manager 12c: What's Changed? What's New?, presented by Oracle Specialized Partners like ROLTA.
    Session ID | Title | Presented by | Day/Time
    920 | Enterprise Manager 12c Cloud Control: New Features and Best Practices | Dell | Sun
    9536 | Release 12 Apps DBA 101 | Justadba, LLC | Mon
    932 | Monitoring Exadata with Cloud Control | Oracle | Mon
    397 | OEM Cloud Control Hands On Performance Tuning | | Mon
    118 | Oracle BI Sys Mgmt Best Practices & New Features | Rittman Mead Consulting | Mon
    548 | High Availability Boot Camp: RAC Design, Install, Manage | Database Administration, Inc | Mon
    926 | The Only Complete Cloud Management Solution -- Oracle Enterprise Manager | Oracle | Mon
    328 | Virtualization Boot Camp | Dell | Mon
    292 | Upgrading to Oracle Enterprise Manager 12c - Best Practices | Southern Utah University | Mon
    793 | Exadata 101 - What You Need to Know | Rolta | Tues
    431 & 1431 | Extreme Database Administration: New Features for Expert DBAs | Oracle | Tue, Wed
    521 | What's New for Oracle WebLogic Management: Capabilities that Scripting Cannot Provide | Oracle | Thu
    338 | Oracle Real Application Testing: A look under the hood | PayPal | Tue
    9398 | Reduce TCO Using Oracle Application Management Suite for Oracle E-Business Suite | Oracle | Tue
    312 | Configuring and Managing a Private Cloud with Oracle Enterprise Manager 12c | Dell | Tue
    866 | Making OEM Sing and Dance with EMCLI | Portland General Electric | Tue
    533 | Oracle Exadata Monitoring: Engineered Systems Management with Oracle Enterprise Manager | Oracle | Wed
    100600 | Optimizing EnterpriseOne System Administration | Oracle | Wed
    9565 | Optimizing EBS on Exadata | Centroid Systems | Wed
    550 | Database-as-a-Service: Enterprise Cloud in Three Simple Steps | Oracle | Wed
    434 | Managing Oracle: Expert Panel on Techniques and Best Practices | Oracle Partners: Dell, Keste, ROLTA, Pythian | Wed
    9760 | Cloud Computing Directions: Understanding Oracle's Cloud | AT&T | Wed
    817 | Right Cloud: Use Oracle Technologies to Avoid False Cloud | Visual Integrator Consulting | Wed
    163 | Forgetting something? Standardize your database monitoring environment with Enterprise Manager 11g | Johnson Controls | Wed
    489 | Oracle Enterprise Manager 12c: What's Changed? What's New? | ROLTA | Thu

    Read the article

  • links for 2011-01-31

    - by Bob Rhubart
    Do (Software) Architects Architect? "The first question, is 'Why is architect being used as a verb?' Mirriam-Webster dictionary does not contain a definition of architect as a verb, nor do many other recognized dictionaries." -- TheCPUWizard (tags: softwarearchitecture) Oracle Business Intelligence Blog: Gartner Magic Quadrant for BI Platforms 2011 "Oracle customers indicate they deploy the Oracle Business Intelligence Suite Enterprise Edition (OBIEE) platform to support among the most complex deployments in our survey." - Gartner (tags: oracle businessintelligence gartner) Oracle BI Server Modeling, Part 1- Designing a Query Factory (Oracle BI Foundation) Bob Ertl lays the groundwork for Business Intelligence modeling concepts with a look at "the big picture of how the BI Server fits into the system, and how the CEIM controls the query processing." (tags: oracle otn businessintelligence) Tom Graves: Modelling people in enterprise-architecture Tom says: "One of the key characteristics of ‘crossing the chasm’ to a viable whole-of-enterprise architecture is the explicit inclusion of people. In short, we need to be able to model and map where people fit in relation to the architecture. But there’s a catch. A big catch." (tags: entarch) Java developer webcasts for customers and partners (SOA Partner Community Blog) Jurgen Kress shares info on several upcoming online events focused on WebLogic. (tags: weblogic oracle otn soa) Business SOA: Data Services are bogus, Information services are real Steve Jones says: "The other day when I was talking about MDM a bright spark pointed out that I hated data services but wasn't MDM just about data services?" (tags: SOA MDM) Andrejus Baranovskis's Blog: Configuring Missing Contribution Folders for Oracle UCM 11g and WebCenter 11g PS3 Andrejus says: "After doing some research on UCM, we found that Folders_g component must be configured in UCM, for Contribution Folders to be enabled." (tags: oracle otn oracleace UCM webcenter enterprise2.0) Wim Coekaerts: Converting an Oracle VM VirtualBox VM into an Oracle VM Server image Wim Coekaerts offers a few simple steps to convert an existing Oracle VM VirtualBox image.  (tags: oracle otn virtualization virtualbox) Stefan Hinker: Secure Deployment of Oracle VM Server for SPARC This new paper from Stefan Hinker will help you understand the general security concerns in virtualized environments as well as the specific additional threats that arise out of them. (tags: oracle otn SPARC virtualization enterprisearchitecture) The EA Roadmap to Rationalize, Standardize, and Consolidate the IT Portfolio Enterprise IT is in a state of constant evolution. As a result, business processes and technologies become increasingly more difficult to change and more costly to keep up-to-date. (tags: entarch oracle otn)

    Read the article

  • 6 Ways to Modernize Your Customer Experience

    - by Mike Stiles
    If customers have changed, if the way they research and shop has changed, if their expectations have changed, if their ability to act on dissatisfaction has changed, but your customer experience has NOT changed, what was once "good enough" may now be crippling. Well, the customer has changed, and why wouldn't they? You've probably changed too in your role as consumer. There's more info available, it's easier to get, there's more choice, you're more mobile, you're more connected, it's easier to buy, and yes, it's easier to switch brands if experiences don't meet your now higher expectations. Thanks to technological advances, we as marketers can increasingly work borderline miracles. But if we're still not adamantly adopting customer centricity, and if we aren't making the customer experience paramount amongst business goals, the tech is wasted. A far more modern customer experience is called for. Here are 6 ways to get there:
    1. Modern Marketing: Marketing data is aggregated and targeted to the right customers, who are getting personal, relevant communications. In return, you're getting insight that finally properly attributes revenue to your marketing efforts.
    2. Modern Selling: Demand is being driven across all channels with modern selling tools. Productivity is up thanks to coordinated communication and selling, and performance is ever optimized using powerful analytics.
    3. Modern CPQ: You're cross-selling and upselling more effectively since reps and channel partners have been empowered with the ability to quickly, automatically generate 100% accurate, customer-friendly quotes complete with price controls and automated approvals.
    4. Modern Commerce: You're leveraging data and delivering personalized, targeted digital experiences to everyone. You're attracting more visitors, and you're able to scale and keep up with the market and control the experience.
    5. Modern Service: You're better serving your customers by making it easier for them to engage with your brand, plus you're lowering your costs by increasing agent and tech support efficiencies.
    6. Modern Social: You're getting faster, deeper, more accurate insights from social and turning content around faster, which then goes out to the right people at the right time in the right place. You've also gotten proactive in your service, and customers love that.
    For far too many brands, the buying journey of Need, Research, Select, Buy, Use, Recommend across the multiple connect points of Social, Mobile, Store, Call Center, Site, Ecommerce is a disconnected mess. Oracle's approach to CX is to connect every interaction your customer has with your brand, avoiding the revenue losses lousy customer experiences bring. How important is the experience to customers? 94% are willing to pay more of their hard-earned money to have better ones, while a meager 1% say they get the good, consistent experiences they expect. Brands, your words aren't as loud anymore, so your actions as they relate to customer experience are going to have to do the talking. @mikestiles @oraclesocial Photo: Julien Tromeur, freeimages.com

    Read the article

  • Simple collision detection in Unity 2D

    - by N1ghtshade3
    I realise other posts exist with this topic yet none have gone into enough detail for me. I am attempting to create a 2D game in Unity using C# as my scripting language. Basically I have two objects, player and bomb. Both were created simply by dragging the respective PNG to the stage. I have set up touch controls to move player left and right; gravity of any kind is not needed as I only require it to move x units when I tap either the left or right side of the screen. This movement is stored in a script called playerController.cs and works just fine. I also have a variable health = 3 for player, which is stored in healthScript.cs. I am now at a point where I am stuck. I would like it so that when player collides with bomb, health decreases by one and the bomb object is destroyed. So what I tried doing is using a new script called playerPhysics.cs, I added the following: void OnCollisionEnter2D(Collision2D coll){ if(coll.gameObject.name=="bomb") GameObject.Destroy("bomb"); healthScript.health -= 1; } While I'm fairly sure I don't know the proper way to reference a variable in another script and that's why the health didn't decrease when I collided, bomb never disappeared from the stage so I'm thinking there's also a problem with my collision. Initially, I had simply attached playerPhysics.cs to player. After searching around though, it appeared as though player also needed a rigidBody attached to it, so I did that. Still no luck. I tried using a circleCollider (player is a circle), using a rigidBody2D, and using all manner of colliders on one and/or both of the objects. If you could please explain what colliders (if any) should be attached to which objects and whether I need to change my script(s), that would be much more helpful than pointing me to one of the generic documentation examples I've already read. Also, if it would be simple to fix the health thing not working that would be an added bonus but not exactly the focus of this question. Bear in mind that this game is 2D; I'm not sure if that changes anything. Thanks!
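    For what it's worth, here is one way the handler could be written so that both statements sit inside the if block, the bomb instance itself is destroyed (Destroy expects an Object, not the string "bomb"), and health is reached through a reference to the other script. The field names and the Inspector wiring are assumptions about how playerController.cs and healthScript.cs are set up, so treat this as a sketch rather than the definitive fix:

        using UnityEngine;

        public class playerPhysics : MonoBehaviour
        {
            // Drag the object that carries healthScript onto this field in the Inspector.
            public healthScript playerHealth;

            void OnCollisionEnter2D(Collision2D coll)
            {
                if (coll.gameObject.name == "bomb")
                {
                    playerHealth.health -= 1;   // decrement the public health field on the other script
                    Destroy(coll.gameObject);   // remove the actual bomb GameObject from the scene
                }
            }
        }

    For the collision itself to register, both player and bomb need 2D colliders (e.g. CircleCollider2D or BoxCollider2D) and at least one of the two needs a Rigidbody2D; if the bomb's collider is marked as a trigger, OnCollisionEnter2D will not fire and OnTriggerEnter2D(Collider2D other) should be used instead.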

    Read the article

  • Interconnect nodes in a Java distributed infrastructure for tweet processing

    - by David Moreno García
    I'm working on a new version of an old project that I used to download and process user statuses from Twitter. The main problem with that project was its infrastructure. I used multiple instances of a Java application (trackers) to download from Twitter given a specific task (basically terms to search for), connected to a central node (a web application) that had to process all tweets once per day and generate a new task for each tracker once every 15 minutes. The central node also had to monitor all trackers and enable/disable them on user request. This, as I said, was too slow because I had multiple bottlenecks, so in this new version I want to improve the infrastructure and isolate each piece of functionality in a specific node. I also need a good notification system to receive notifications from any node. So, in the next diagram I show the components that I'll need in this new version: As you can see, there are more nodes. Here are some notes about them: Dashboard: Controls tracker statuses and sends a single task to each of them (on user request). The trackers will use this task until it is replaced with a new one (when needed, not every 15 minutes like before). Search engine: I need to store all the tweets. They are first stored in a local database for each tracker, but after that I'm thinking of using something like Elasticsearch to be able to do fast searches. Tweet processor: Just an isolated component with its own database (maybe something like the search engine, to have fast access to the info generated by the module). In the future more could be added. Application UI: A web application sharing a database with the Dashboard (mainly to store user information and preferences). Indeed, both could be merged into a single web application. The main difference from the previous version of the project is that now these components will be isolated and will only show information and send requests; I will not do any heavy work in them (like processing tweets as I did before). So, having these components, my main headache is how to structure everything so that I don't have to rewrite a lot of code every time I need to access any new data. Another headache is how to interconnect the nodes. I could use sockets, but that is a pain in the ass. Maybe a REST layer? And finally, if all the nodes are isolated, how could I generate notifications for each user whose info lives only in the database used by the Application UI? I'm programming this in Java and Spring (at least I used them in the last version), but I have no problem changing the language if I can take advantage of a tool/library/engine that makes my life easier and gives me a better platform. Any comment will be appreciated.
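    For the interconnect question, a plain REST layer over HTTP is usually the least painful fit for a Java/Spring stack like this. Below is a rough sketch, not a definitive design, of what a tracker node could expose; it assumes Spring Boot, and TweetTask and the endpoint paths are invented for illustration, not taken from the project:

        // Rough sketch only: a tracker node exposing its task over REST (assumes Spring Boot).
        // TweetTask and the endpoint paths are illustrative, not the project's real names.
        package tracker;

        import java.util.List;
        import java.util.concurrent.atomic.AtomicReference;

        import org.springframework.boot.SpringApplication;
        import org.springframework.boot.autoconfigure.SpringBootApplication;
        import org.springframework.web.bind.annotation.GetMapping;
        import org.springframework.web.bind.annotation.PutMapping;
        import org.springframework.web.bind.annotation.RequestBody;
        import org.springframework.web.bind.annotation.RequestMapping;
        import org.springframework.web.bind.annotation.RestController;

        @SpringBootApplication
        @RestController
        @RequestMapping("/tracker")
        public class TrackerNode {

            // Hypothetical task payload: the search terms this tracker should follow.
            public record TweetTask(String id, List<String> terms) {}

            private final AtomicReference<TweetTask> currentTask = new AtomicReference<>();

            // The Dashboard pushes a new task on user request; the tracker keeps it until replaced.
            @PutMapping("/task")
            public void assignTask(@RequestBody TweetTask task) {
                currentTask.set(task);
            }

            // The Dashboard (or the Application UI) polls the tracker's current task.
            @GetMapping("/task")
            public TweetTask currentTask() {
                return currentTask.get();
            }

            public static void main(String[] args) {
                SpringApplication.run(TrackerNode.class, args);
            }
        }

    For the notification question, a small message broker between the nodes and the Application UI (RabbitMQ or anything with a Java client) tends to be simpler than having every node write into the UI's database.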

    Read the article

  • Web Development Goes Pre-Visual InterDev

    - by Ken Cox [MVP]
    As a longtime and hardcore ASP.NET webforms developer, I’m finding the new client-side development world a bit of a grind.  I love learning new technologies, but I can’t help feeling we’ve regressed and lost our old RAD advantage as we move heavy lifting to the client. For my latest project, I’m using Telerik’s KendoUI in Visual Studio 2012. To say I feel clumsy writing this much JavaScript is an understatement. It seems like the only safe way to ‘write’ this code is by copying a working snippet from someone else and pasting it into my HTML page.  For me, JavaScript has largely been for small UI tasks like client-side validation and a bit of AJAX – and often emitted by a server-side control. I find myself today lost in nests of curly braces that Ctrl+K, Ctrl+D doesn’t seem to understand that well either. IntelliSense, my old syntax saviour, doesn’t seem to have kept up with this cobweb of code either. Code completion? Not seeing it. As I fumbled about this evening, I thought about how web development rocketed forward when Microsoft introduced Visual InterDev. Its Design-Time Controls (DTCs) changed the way we created sites. All the iterations of Visual Studio have enhanced that server-side experience where you let a tool write the bulk of the code and manually finesse it from there. What happened? Why am I typing  properties and values (especially default values!) into VS 2012 to get a client-side grid on a page? Where are the drag and drop objects that traditionally provided 70 percent of the mark-up and configuration?  Did we forget how to write Property Pages where you enter a value and the correct syntax appears magically in the source code? To me, the tooling was looking the other way as the scene shifted from server-side code to nimble client-side script. It’ll have to catch up. Although JavaScript is the lingua franca of web browsers, the language is unwieldy, tough to maintain, and messy to debug. If a .NET JIT compiler can turn our VB, F#, and C# source code into an Intermediate Language that executes on a computer, I don’t see why there can’t be a client-side compiler that turns a .NET language into JavaScript that browsers can consume.

    Read the article

  • Windows8, JavaScript and HTML5 - A good thing?

    - by Albers
    Most of us have seen the Windows 8 news regarding support for native HTML5/JavaScript applications. The press has pushed this as a potential threat to the .NET developer community because JavaScript and HTML5 were called "our new developer platform". The press release refers to "Web-connected and Web-powered apps built using HTML5 and JavaScript that have access to the full power of the PC." Microsoft has also kept quiet on details related to these comments. Before we buy the hype and start worrying about a world where we drop our Visual Studio licenses and buy DreamWeaver - let's think about how Windows 8 HTML/JavaScript applications would be implemented. The HTML5 spec offers support for offline applications, but this won't offer the OS-integrated experience the press release refers to. MS has to be planning a way to extend access beyond the traditional JavaScript feature set. Microsoft has a similar option today: HTML Applications or HTAs. They come close to the required features, but HTAs need ActiveX or Java integration to provide the promised OS-level access. I'm guessing that Microsoft's future OS strategy isn't built on developers cranking out ActiveX controls or Java applets. So where is Microsoft headed? One possibility is that MS builds a new JavaScript framework from the ground up outside their current APIs. Another idea would be for Microsoft to add support for JavaScript as a first-class .NET language using the Dynamic Language Runtime. A solution based on the DLR could be integrated into an HTA-like model to provide the promised access, along with the full range of features in the .NET Framework. Security comes included in the Framework. And the work necessary to support this integration would tie in nicely with the effort MS has recently made providing better JavaScript and HTML5 support in Visual Studio 2010. As a bonus, a full-fledged JavaScript DLR implementation would allow single-language web solutions across client and server (think node.js) and would appeal to developers who are familiar with JavaScript but have less experience with the Microsoft tech stack. We will all get a better picture after the Build conference in September. But in the meantime we know that Microsoft has a reputation for providing strong developer support. We might want to reserve our harshest judgement and consider that the press release could hint at new opportunities for .NET development.

    Read the article

  • WebCenter Customer Spotlight: Guizhou Power Grid Company

    - by me
    Author: Peter Reiser - Social Business Evangelist, Oracle WebCenter. Solution Summary: Guizhou Power Grid Company is responsible for power grid planning, construction, management, and power distribution in Guizhou Province, serving 39 million people. Guizhou has 49,823 employees and an annual revenue of over $5 Billion. The business objectives were to consolidate information contained in disparate systems into a single knowledge repository and provide a safe and efficient way for staff and managers to access, query, share, manage, and store business information. Guizhou Power Grid Company saved more than US$693,000 in storage costs, reduced average search times from 180 seconds to 5 seconds, and solved 80% to 90% of technology and maintenance issues by searching the Oracle WebCenter Content management system. Company Overview: A wholly owned subsidiary of China Southern Power Grid Company Limited, Guizhou Power Grid Company is responsible for power grid planning, construction, management, and power distribution in Guizhou Province, serving 39 million people. Guizhou has 49,823 employees and an annual revenue of over $5 Billion. Business Challenges: The business objectives were to consolidate information contained in disparate systems, such as the customer relationship management and power grid management systems, into a single knowledge repository and provide a safe and efficient way for staff and managers to access, query, share, manage, and store business information. Solution Deployed: Guizhou Power Grid Company implemented Oracle WebCenter Content to build a content management system that enabled the secure, integrated management and storage of information, such as documents, records, images, Web content, and digital assets. The content management solution was integrated with the power grid, customer service, maintenance, and other business systems, as well as the corporate Web site. Business Results: Saved more than US$693,000 in storage costs and shortened the material distribution time by integrating the knowledge management solution with the power grid, customer service, maintenance, and other business systems, as well as the corporate Web site; enabled staff to search 31,650 documents using catalogs, multidimensional attributes, and knowledge maps, reducing average search times from 180 seconds to 5 seconds and saving approximately 1,539 hours in annual search time; gained comprehensive document management, format transformation, security, and auditing capabilities; enabled users to upload new documents and supervisors to check the accuracy of these documents online, resulting in improved information quality control; solved 80% to 90% of technology and maintenance issues by searching the Oracle content management system for information, ensuring IT staff can respond quickly to users’ technical problems; improved security by using role-based access controls to restrict access to confidential documents and information; and supported the efficient classification of corporate knowledge by using Oracle’s metadata functions to collect, tag, and archive documents, images, Web content, and digital assets. “We chose Oracle WebCenter Content, as it is an outstanding integrated content management platform. It has allowed us to establish a system to access, query, share, manage, and store our corporate assets. This has laid a solid foundation for Guizhou Power Grid Company to improve management practices.” - Luo Sixi, Senior Information Consultant, Guizhou Power Grid Company. Additional Information: Guizhou Power Grid Company Customer Snapshot, Oracle WebCenter Content

    Read the article

  • ITT Corporation Goes Live on Oracle Sales and Marketing Cloud Service (Fusion CRM)!

    - by Richard Lefebvre
    Back in Q2 of FY12, a division of ITT invited Oracle to demo our CRM On Demand product while the group was considering Salesforce.com. Chris Porter, our Oracle Direct sales representative, learned the players and their needs and began to develop relationships. We lost that deal, but not Chris's persistence. A few months passed and Chris called on the ITT Shape Cutting Division's Director of Sales to see how things were going. Chris was told that the plan was for the division to buy more Salesforce.com. In fact, he informed Chris that he had just sent his team to Salesforce.com training. During the conversation, Chris mentioned that our new Oracle Sales Cloud Service could run with Outlook. This caused the ITT Sales Director to reconsider the plan to move forward with our competition. Oracle was invited back to demo the Oracle Sales and Marketing Cloud Service (Fusion CRM) and after it concluded, the Director stated, "That just blew your competition away." The deal closed on June 5th, 2012. Our Oracle Platinum Partner, Intelenex, began the implementation with ITT on July 30th. We are happy to report that on September 18th, the ITT Shape Cutting Division successfully went live on Oracle Sales and Marketing Cloud Service (Fusion CRM). About: ITT is a diversified leading manufacturer of highly engineered critical components and customized technology solutions for growing industrial end-markets in energy infrastructure, electronics, aerospace and transportation. Building on its heritage of innovation, ITT partners with its customers to deliver enduring solutions to the key industries that underpin our modern way of life. Founded in 1920, ITT is headquartered in White Plains, NY, with 8,500 employees in more than 30 countries and sales in more than 125 countries. The ITT Shape Cutting Division provides plasma lasers and controls with the Burny, Kaliburn, and AMC brands. Oracle Fusion Products: Oracle Sales and Marketing Cloud Service (Fusion CRM) including: • Fusion CRM Base • Fusion Sales Cloud • Fusion Mobile and Desktop Integration • Automated Forecasting Adoption Model: SaaS Partner: Intelenex Business Drivers: The ITT Shape Cutting Division wanted to: better enable its Sales Force with email and mobile CRM capabilities; simplify and automate its complex sales processes; and centrally manage and maintain customer contact information. Why We Won: ITT was impressed with the feature-rich capabilities of Oracle Sales and Marketing Cloud Service (Fusion CRM), including sales performance management and integration. The company also liked the product's flexibility and scalability for future growth. Expected Benefits: Streamlined, accurate forecasting; increased customer manageability; improved sales performance; and better visibility into customer information

    Read the article

  • Halloween: Season for Java Embedded Internet of Spooky Things (IoST) (Part 4)

    - by hinkmond
    And now here's the Java code that you'll need to read your ghost sensor on your Raspberry Pi. The general idea is that you are using Java code to access the GPIO pin on your Raspberry Pi where the ghost sensor (JFET transistor) detects minute changes in the electromagnetic field near the Raspberry Pi and will change the GPIO pin to high (+3 volts) when something is detected, otherwise there is no value (ground). Here's that Java code: try { /*** Init GPIO port(s) for input ***/ // Open file handles to GPIO port unexport and export controls FileWriter unexportFile = new FileWriter("/sys/class/gpio/unexport"); FileWriter exportFile = new FileWriter("/sys/class/gpio/export"); for (String gpioChannel : GpioChannels) { System.out.println(gpioChannel); // Reset the port File exportFileCheck = new File("/sys/class/gpio/gpio"+gpioChannel); if (exportFileCheck.exists()) { unexportFile.write(gpioChannel); unexportFile.flush(); } // Set the port for use exportFile.write(gpioChannel); exportFile.flush(); // Open file handle to input/output direction control of port FileWriter directionFile = new FileWriter("/sys/class/gpio/gpio" + gpioChannel + "/direction"); // Set port for input directionFile.write(GPIO_IN); } /*** Read data from each GPIO port ***/ RandomAccessFile[] raf = new RandomAccessFile[GpioChannels.length]; int sleepPeriod = 10; final int MAXBUF = 256; byte[] inBytes = new byte[MAXBUF]; String inLine; int zeroCounter = 0; // Get current timestamp with Calendar() Calendar cal; DateFormat dateFormat = new SimpleDateFormat("yyyy/MM/dd HH:mm:ss.SSS"); String dateStr; // Open RandomAccessFile handle to each GPIO port for (int channum=0; channum And, then we just load up our Java SE Embedded app, place each Raspberry Pi with a ghost sensor attached in strategic locations around our Santa Clara office (which apparently is very haunted by ghosts from the Agnews Insane Asylum 1906 earthquake), and watch our analytics for any ghosts. Easy peazy. See the previous posts for the full series on the steps to this cool demo: Halloween: Season for Java Embedded Internet of Spooky Things (IoST) (Part 1) Halloween: Season for Java Embedded Internet of Spooky Things (IoST) (Part 2) Halloween: Season for Java Embedded Internet of Spooky Things (IoST) (Part 3) Halloween: Season for Java Embedded Internet of Spooky Things (IoST) (Part 4) Hinkmond
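    The tail of that read loop did not survive the excerpt above. As a rough idea of what such a loop does, here is a minimal, hedged sketch of polling one exported GPIO value file over sysfs; the pin number and poll interval are assumptions for illustration, not the author's exact code:

        // Hedged sketch, not the original: poll a sysfs GPIO value file and log "detections".
        import java.io.RandomAccessFile;
        import java.util.Date;

        public class GhostSensorPoller {
            public static void main(String[] args) throws Exception {
                String gpioChannel = "7"; // assumed pin; substitute whatever GpioChannels holds
                try (RandomAccessFile raf =
                         new RandomAccessFile("/sys/class/gpio/gpio" + gpioChannel + "/value", "r")) {
                    while (true) {
                        raf.seek(0);                   // re-read the value file from the start
                        String value = raf.readLine(); // "1" when the sensor pulls the pin high
                        if ("1".equals(value)) {
                            System.out.println("Ghost detected at " + new Date());
                        }
                        Thread.sleep(10);              // matches the sleepPeriod of 10 in the post
                    }
                }
            }
        }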

    Read the article

  • SharePoint Thoughts

    - by Tim Murphy
    I was listening to .NET Rocks episode #713 and it got me thinking about a number of SharePoint-related topics. I have been working with SharePoint since the 2001 product came out and have watched it evolve over the years.  Today SharePoint is one of the most powerful and flexible products in the market.  Of course that doesn’t mean there isn’t room for improvement (a lot of improvement in fact) and with much power comes much responsibility. My main gripe these days is that you have to develop on a server instance.  This adds a real barrier to entry for developers.  You either have to run VMware or Hyper-V on your developer machine or actually develop on your dev server for most tasks.  Yes, there is a way to set up a Windows 7 machine with the SharePoint components but it is very hackish. Beyond that the tools in VS2010 are a great leap forward from past generations.  Not requiring a separate package creation tool is not the least of the improvements.  Better workflow and web part development have also eased the burden of many developers. The other thing the show brought up in my thoughts was more around usage.  Users want to be able to self-serve everything without regard to what effect that has on leveraging their data from a corporate perspective.  My coworkers who work on Lotus Notes ask why the user can’t just do whatever they want. Part of the reason is that those features have not been built, but the other part is that giving them those features is often like giving an infant a loaded handgun.  You can do it but it doesn’t make it the smart thing to do. As with any tool that is going to be used in the enterprise it should be subject to governance.  If controls are not in place, as they said in the episode of DNR, the document libraries, and I believe SharePoint in general, start to look as disarrayed and unusable as a shared drive.  Consider these factors before giving in to every whim of the users.  You should be able to explain to them the tradeoffs of giving them full control versus being able to leverage the information they collect to the benefit of the organization. These are just a couple of the thoughts that were triggered by the show.  I’m sure there are more discussions that can be had.  Feel free to leave your comments about the pros and cons of SharePoint. del.icio.us Tags: .NET Rocks,SharePoint,software development

    Read the article

  • Actor and Sprite, who should own these properties?

    - by Gerardo Marset
    I'm writing sort of a 2D game engine for making the process of creating games easier. It has two classes, Actor and Sprite. Actor is used for interactive elements (the player, enemies, bullets, a menu, an invisible instance that controls score, etc) and Sprite is used for animated (or not) images with transparency (or not). The actor may have an assigned sprite that represents it on the screen, which may change during the game. E.g. in a top-down action game you may have an actor with a sprite of a little guy that changes when attacking, walking, and facing different directions, etc. Currently the actor has x and y properties (its coordinates on the screen), while the sprite has an index property (the number of the frame currently being shown by the sprite). Since the sprite doesn't know which actor it belongs to (or if it belongs to an actor at all), the actor must pass its x and y coordinates when drawing the sprite. Also, since an actor may reset its sprite each frame (and usually does), the sprite's index property must be passed from the old to the new sprite like so (pseudocode): function change_sprite(new_sprite) old_index = my.sprite.index my.sprite = new_sprite() my.sprite.index = old_index % my.sprite.frames end I always thought this was kind of cumbersome, but it never was a big problem. Now I decided to add support for more properties. Namely, a property to draw the sprite rotated, a property to draw it flipped, a property to draw it stretched, etc. These should probably belong to the sprite and not the actor, but if they do, the actor would have to pass them from the old to the new sprite each time it changes... On the other hand, if they belonged to the actor, the actor would have to pass each property to the sprite when drawing it (since the sprite doesn't know which actor it belongs to, and it shouldn't, since sprites aren't just meant to be used by actors, really). Another option I thought of would be having an extra class that owns all these properties (plus index, x and y) and links an actor with a sprite, but that doesn't come without drawbacks. So, what should I do with all these properties? Thanks!
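    On that last option, one way to sketch it is to give every actor a small render-state object that owns position, frame index, rotation, flip, and scale; sprites only read from it at draw time, so nothing has to be copied when the sprite is swapped. A rough Java sketch under that assumption (all names are invented for illustration, not part of the engine described above):

        // Rough sketch of the "extra class" option; every name here is made up for illustration.
        public class RenderState {
            public double x, y;                 // screen position, driven by the actor's logic
            public int frameIndex;              // animation phase, survives sprite swaps
            public double rotation;             // radians
            public double scaleX = 1.0, scaleY = 1.0;
            public boolean flipX, flipY;
        }

        interface Sprite {
            int frameCount();
            void draw(RenderState state);       // the sprite reads everything it needs from the state
        }

        class Actor {
            final RenderState render = new RenderState();
            private Sprite sprite;

            void changeSprite(Sprite newSprite) {
                sprite = newSprite;
                int frames = newSprite.frameCount();
                if (frames > 0) {
                    render.frameIndex %= frames; // keep the animation phase across the swap
                }
            }

            void draw() {
                if (sprite != null) {
                    sprite.draw(render);
                }
            }
        }

    The drawback the asker anticipates still applies (a third object to keep in sync), but because the sprite only ever receives the state as a draw argument, sprites remain usable outside of actors.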

    Read the article

  • Running Built-In Test Simulator with SOA Suite Healthcare 11g in PS4 and PS5

    - by Shub Lahiri, A-Team
    Background: The SOA Suite for Healthcare Integration pack comes with a pre-installed simulator that can be used as an external endpoint to generate inbound and outbound HL7 traffic on specified MLLP ports. This is a command-line utility that can be very handy when trying to build a complete end-to-end demo within a standalone, closed environment. The ant-based utility accepts the name of a configuration file as the command-line input argument. The format of this configuration file has changed between PS4 and PS5. In PS4, the configuration file was XML based and in PS5, it is name-value property based. The rest of this note highlights these differences and provides samples that can be used to run the first scenario from the product samples set. PS4 - Configuration File: The sample configuration file for PS4 is shown below. The configuration file contains information about the following items: the directory for incoming and outgoing files for the host running SOA Suite Healthcare; the polling interval for the directory; external endpoint logical names; external endpoint server host name and ports; the message throughput to be simulated for generating outbound messages; and the documents to be handled by different endpoints. A copy of this file can be downloaded from here. PS5 - Configuration File: The corresponding sample configuration file for PS5 is shown below. The configuration file contains similar information about the sample scenario but is not in XML format. It has name-value pairs specified in the form of a properties file. This sample file can be downloaded from here. Simulator Configuration: Before running the simulator, the environment has to be set by defining the proper ANT_HOME and JAVA_HOME. The following extract is taken from a working sample shell script to set the environment. Also, as a part of setting the environment, template jndi.properties and logging.properties can be generated by using the following ant command: ant -f ant-b2bsimulator-util.xml b2bsimulator-prop Sample jndi.properties and logging.properties are shown below and can be modified as needed. The jndi.properties contains information about connectivity to the local Weblogic Managed Server instance and the logging.properties file controls the amount of logging that can be generated from the running simulator process. Simulator Usage - Start and Stop: The command syntax to launch the simulator via ant is the same in PS4 and PS5. Only the appropriate configuration file has to be supplied as the command-line argument, for example: ant -f ant-b2bsimulator-util.xml b2bsimulatorstart -Dargs="simulator1.hl7-config.xml" This will start the simulator, which will keep running to provide an active external endpoint for the SOA Healthcare Integration engine. To stop the simulator, a similar ant command can be used, for example: ant -f ant-b2bsimulator-util.xml b2bsimulatorstop
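    The screenshots of the generated files did not survive this excerpt, but jndi.properties is a standard WebLogic JNDI client file, so its typical shape looks roughly like the following (the host, port, and credentials below are placeholders, not values from the post):

        java.naming.factory.initial=weblogic.jndi.WLInitialContextFactory
        java.naming.provider.url=t3://localhost:7001
        java.naming.security.principal=weblogic
        java.naming.security.credentials=your-password-here

    The logging.properties file follows the usual java.util.logging format; raising or lowering the simulator's log level there controls how verbose the console output is.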

    Read the article

  • Why does my ID3DXSprite appear to be incorrectly scaled?

    - by Bjoern
    I am using D3D9 for rendering some simple things (a movie) as the backmost layer, then on top of that some text messages, and now wanted to add some buttons to that. Before adding the buttons everything seemed to have worked fine, and I was using an ID3DXSprite for the text as well (ID3DXFont), now I am loading some graphics for the buttons, but they seem to be scaled to something like 1.2 times their original size. In my test window I centered the graphic, but it being too big it just doesn't fit well (for example, the client area is 640x360, the graphic is 440, so I expect 100 pixels on the left and right; the left side is fine [I took a screenshot and "counted" the pixels in Photoshop], but on the right there are only about 20 pixels). My rendering code is very simple (I am omitting error checks, et cetera, for brevity) // initially viewport was set to width/height of client area // clear device m_d3dDevice->Clear( 0, NULL, D3DCLEAR_TARGET|D3DCLEAR_STENCIL|D3DCLEAR_ZBUFFER, D3DCOLOR_ARGB(0,0,0,0), 1.0f, 0 ); // begin scene m_d3dDevice->BeginScene(); // render movie surface (just two triangles to which the movie is rendered) m_d3dDevice->SetRenderState(D3DRS_ALPHABLENDENABLE,false); m_d3dDevice->SetSamplerState( 0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR ); // bilinear filtering m_d3dDevice->SetSamplerState( 0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR ); // bilinear filtering m_d3dDevice->SetTextureStageState( 0, D3DTSS_COLORARG1, D3DTA_TEXTURE ); m_d3dDevice->SetTextureStageState( 0, D3DTSS_COLORARG2, D3DTA_DIFFUSE ); //Ignored m_d3dDevice->SetTextureStageState( 0, D3DTSS_COLOROP, D3DTOP_SELECTARG1 ); m_d3dDevice->SetTexture( 0, m_movieTexture ); m_d3dDevice->SetStreamSource(0, m_displayPlaneVertexBuffer, 0, sizeof(Vertex)); m_d3dDevice->SetFVF(Vertex::FVF_Flags); m_d3dDevice->DrawPrimitive(D3DPT_TRIANGLELIST, 0, 2); // render sprites m_sprite->Begin(D3DXSPRITE_ALPHABLEND | D3DXSPRITE_SORT_TEXTURE | D3DXSPRITE_DO_NOT_ADDREF_TEXTURE); // text drop shadow m_font->DrawText( m_playerSprite, m_currentMessage.c_str(), m_currentMessage.size(), &m_playerFontRectDropShadow, DT_RIGHT|DT_TOP|DT_NOCLIP, m_playerFontColorDropShadow ); // text m_font->DrawText( m_playerSprite, m_currentMessage.c_str(), m_currentMessage.size(), &m_playerFontRect, DT_RIGHT|DT_TOP|DT_NOCLIP, m_playerFontColorMessage ); // control object m_sprite->Draw( m_texture, 0, 0, &m_vecPos, 0xFFFFFFFF ); // draws a few objects like this m_sprite->End(); // end scene m_d3dDevice->EndScene(); What did I forget to do here? Except for the control objects (play button, pause button etc which are placed on a "panel" which is about 440 pixels wide) everything seems fine, the objects are positioned where I expect them, but just too big. By the way I loaded the images using D3DXCreateTextureFromFileEx (resizing window, and reacting to lost device, etc, works fine too). For experimenting, I added some code to take an identity matrix and scale it down on the x/y axis to 0.75f, which then gave me the expected result for the controls (but also made the text smaller and out of position), but I don't know why I would need to scale anything. My rendering code is so simple, I just wanted to draw my 2D objects 1:1 the size they came from the file... I am really very inexperienced in D3D, so the answer might be very simple...

    Read the article

< Previous Page | 212 213 214 215 216 217 218 219 220 221 222 223  | Next Page >