Search Results

Search found 11735 results on 470 pages for 'global variables'.


  • Installation procedure RAC One Node

    - by rene.kundersma
    Okay, in order to test RAC One Node on my Oracle VM laptop, I just:
    - installed Oracle VM 2.2
    - created two OEL 5.3 images
    The two images are fully prepared for Oracle 11gR2 Grid Infrastructure and 11gR2 RAC, including four shared disks for ASM and private NICs. After installation of the Oracle 11gR2 Grid Infrastructure and a "software only installation" of 11gR2 RAC, I installed patch 9004119, as you can see in the opatch lsinv output. This patch has the scripts required to administer RAC One Node; you will see them later. At the moment we have them available for Linux and Solaris. After installation of the patch, I created a RAC database with an instance on one node. Please note that the "Global Database Name" has to be the same as the SID prefix and should be less than or equal to 8 characters. When the database creation is done, I first create a service. This is because RAC One Node needs to be "initialized" each time you add a service. After creating the service, a script called raconeinit needs to run from $RDBMS_HOME/bin. This is a script supplied by the patch; I can imagine the next major patch set of 11gR2 will have these scripts available by default. The script will configure the database to run on other nodes. After initialization, running raconeinit again shows that the configuration is already in place. So, now the configuration is ready and we are ready to run 'Omotion' and move the service around from one node to the other (yes, VM competitor: the service remains available during the migration, nice right?). Omotion is started by running Omotion; with Omotion -v you get verbose output. During the migration you will see the two instances active, and after the migration there is only one instance left on the new node.

    Read the article

  • CodePlex Daily Summary for Wednesday, May 05, 2010

    CodePlex Daily Summary for Wednesday, May 05, 2010New Projects2010微软精英大挑战Heritage of Dragon项目: 我们来自上海市同济大学,兴趣相投,集聚于此共同构建一个开放的网络平台。致力于运用构建在云端基于地图的服务,使用文字、图片、视频、互动动画等形式来展示全国各地的传统手工艺。并且充分发挥网络的优势,通过开放协作的维基平台人人都可以参与到内容的添加修改与完善中来。目的在于记录、展示、挖掘、传承中国古...AutoArchive: Auto archive your "my documents" to a remote machine. I'm writing this so my wife can put things in "my documents" and it'll automaticly archive i...BigDoor .NET Client: A .NET client for the BigDoor Media API. The API enables secured virtual transactions with support for any number of currencies, transactions, awar...bubujie: Dreamweaver LibraryGeckoGit: GeckoGit is a combination of TortoiseSVN and AnkhSVN, but for Git repositories, and built on the GitSharp library.Global: global, config, mail, http, rest, xml, serialization, helper, path, ioIndustrial Dashboard Connected Grid webpart: This Sharepoint 2007/10 webpart provides a simple way to display grid based reports populated with data that comes from a SQL Server stored procedu...IpControls: "IpControls" contains IPv4 and IPv6 text boxes, both as Windows Forms and WPF version. The IPv6 control automatically detects the older hybrid for...LiteME: LiteME is short for LiteMapleStoryEmulator... it is v75, open-source, and still going through it's alpha stages. It is still in development!Meditel PHP Class: Une classe PHP qui vous permet de d'envoyer des SMS vers tous les numeros Meditel en utilisant leservice des SMS gratuits depuis le site Meditel.maMoneySafe: Help people.Mouse Zoom - Visual Studio Extension: Mouse Zoom is a Visual Studio 2010 extension that will cause the mouse zoom functionality to zoom at the mouse's cursor instead of at the top of th...Multi-Language Words Memorizer: This .net application is designed for learning words and help foreign language learners by lots of automatic features. After you select a list of ...Navigation for ASP.NET Web Forms: Navigation for ASP.NET Web Forms manages movement and data passing between aspx Pages in a unit testable manner. There is no Client-side logic, so ...NazTek.Extension.Clr4: CLR 4.0 extensions and utility APIOpalis Community Releases: Sample workflows, objects, code and other items related to System Center's Opalis Integration Server, published by the Opalis team.Power Video Player: Power Video Player is a slim feature-rich video/dvd player that meets everyday needs in video playback on PC with a bunch of advanced features on b...SchemeEditor: <WPF> <.NET> <Editor> <Silverlight> <Scheme> <Graphics> <simulink> <schematic>StyleCop+: StyleCop+ is a plug-in that extends original StyleCop features.timemanager2010: Just another work time managerTweetTunes: Updates Twitter with current song playing in iTunes - if your Twitter account is linked to Facebook - it will update that too The twittervb2 down...WCF Discovery Library: WCF Discovery Library is a small collection of utilities that makes it easy to add WCF 4.0 Discovery features into your projects.New ReleasesAjaxControlToolkit additional extenders: ControlToolkitExtended: this build contains web example with BreadCrumbsAnyCAD: AnyCAD Free Beta1: AnyCAD Free Beta1Baccarat: Single player practice baccarat: This is a simple baccarat game for Windows Mobile. It is single player and is only a practice version, which will help users familiarize themselve...BigDoor .NET Client: BigDoor .NET 2.0 Client (Alpha): Our first iteration of the .NET client. 
Please fork and or ask to be added if you want to make any contributions.CBM-Command: 2010-05-04: Release NotesNew Features Panel navigation now complete. Scroll up and down through directories using the up and down cursor keys. Switch between...Directory Linker: Directory Linker 2.1: This release introduces XP support, more information about all features can be found at http://www.humblecoder.co.uk/?p=141Extend SmallBasic: Teaching Extensions v.015: added high low quizGoogle AJAX Search Services for jQuery: jquery.gss-0.1.3.js: First official release - use at your own discretion. Thanks, AndrewIndustrial Dashboard Connected Grid webpart: Filtered Industrial Grid: Filtered Industrial Grid web part for SharePoint 2007/2010, First Release.jQuery Library for SharePoint Web Services: SPServices 0.5.5: IMPORTANT NOTE: This release is in an alpha state. You should only download it if you know what you are getting and are interested in testing it f...Meditel PHP Class: Meditel PHP Class: Zipped File : Example : exemplemeditel.php PHP Class : meditel.class.phpMulti-Language Words Memorizer: Memorizer 1.0: First release.mwNSPECT: mwNSPECT Plugin DLL: mwNSPECT Mapwindow plugin dll. Place in your MapWindow or BASINS plugins directory. Presently only for testing form functionality (not including...mwNSPECT: mwNSPECT Simple Installer: Simplistic mwNSPECT Mapwindow plugin installer using Inno setup. Installs all the files you'll need for NSPECT into the C:\NSPECT folder and insta...MyWSAT - ASP.NET Membership Administration Tool: MyWSAT v3.5.3: MyWSAT 3.5.3 Update Notes - May 4th 2010 1.) Added the user search box and a-z navigation menu to all relevant user gridviews. 2.) Added a membersh...Object/Relational Mapper & Code Generator in Net 2.0 for Relational & XML Schema: 2.7: Upgraded UI-generation templates for special case of associative tables (2-column primary keys). Minor bugfix with template-editor.Open NFSe: Open NFSe 2.0: Versao para Belo Horizonte utilizando Windows Services.Power Video Player: PVP 1.1.3776: v1.1.3776 This is mainly a rebuild of version 1.1 under Ms-PL license and is the 1st version available at CodePlex.PROGRAMMABLE SOFTWARE DEVELOPMENT ENVIRONMENT: PROGRAMMABLE SOFTWARE DEVELOPMENT ENVIRONMENT-3.1: The following error has been corrected: PCG ERROR: srcproj -- 3933 PCG ERROR: srcproj -- 2943 PCG ERROR: devproj -- 1474 PCG ERROR: mainprj -- 128...Rehost Image: 1.3.9: Fixed locations saving for mac and linux platforms.Robot Shootans: Robot Shootans 0.5.1 (Windows): This is the first public release of this game. Instructions on how to play are included in the game itself Known issues: Changing control style wh...SchemeEditor: SchemeEditor Beta: First release. Wait for documentation & update for some new functionSharePoint Rsync List: SharePoint Rsync 0.9.0.0: Initial release of sprsync. Comments, questions, feedback, and code enhancements are welcome!Software Is Hardwork: Sw. Is Hw. Lib. 3.0.0.x+01: Sw. Is Hw. Lib. 3.0.0.x+01 UNSUPPORTED, UNTESTED ALPHA RELEASE Code may disappear. This is just a preview of code that was in progress. Code is s...Software Localization Tool: SharpSLT 1.0.1: Minor release: bug fixes slight changes in the UIStyleCop+: StyleCop+ 0.6: Several important improvements made for Advanced Naming Rules: - Added new entities for fields and constants - Added new entities for methods (incl...turing machine simulator: First version of turing machine: Overview: First version of turing simulator with example script (transaction function). 
Files: SimulatorGui.exe - main GUI of simulator TuringMach...VCC: Latest build, v2.1.30504.0: Automatic drop of latest buildVocabulary Training Center: Basic Edition 1.1: A release with medium large changes: New functionality: Multiple-choice questions added Grammatical questions added Evaluation changed accordin...Web Service Software Factory: Web Service Software Factory 2010 RC: To use the Web Service Software Factory 2010, you need the following software installed on your computer: • Microsoft Visual Studio 2010 (Ultima...Web Service Software Factory: WSSF2010 Guide: This is the help and guidance for Web Service Software Factory 2010Windows Phone 7 Panorama control: panorama control v0.6 + samples: IMPORTANT NOTE: Please read the following bug + suggested workaround. I'll fix this in a new release shortly. Panorama Control source code + sampl...WPF Behavior Library: WPF Behavior Library 0.2 Release: Drag & Drop Took away the ItemType and DataTemplate requirements Added functions for inheritors to be able to provide custom logic to handle movi...Most Popular ProjectsRawrWBFS ManagerAJAX Control ToolkitMicrosoft SQL Server Product Samples: DatabaseSilverlight Toolkitpatterns & practices – Enterprise LibraryWindows Presentation Foundation (WPF)iTuner - The iTunes CompanionDotNetNuke® Community EditionASP.NETMost Active Projectspatterns & practices – Enterprise LibraryAJAX Control FrameworkHydroServer - CUAHSI Hydrologic Information System ServerIonics Isapi Rewrite Filterpatterns & practices: Azure Security GuidanceRawrBlogEngine.NETTinyProjectNB_Store - Free DotNetNuke Ecommerce Catalog ModuleAll-In-One Code Framework

    Read the article

  • Remastered King’s Quest Games Offer Classic Gaming on Modern Machines

    - by ETC
    If you were a fan of the King’s Quest series of adventure games from the 1980s and 1990s, you’ll be thrilled to see them remastered with enhanced graphics, polished voice acting, and more. Gaming studio AGD Interactive has been engaged in a multi-year project to remaster and freely release the classic King’s Quest games. They started with King’s Quest I in 2001, moved on to King’s Quest II, and just recently released King’s Quest III (seen in the screenshot above). The gameplay for all three games is excellent and really captures the feel of the original games (with better graphics, quality voice acting, and adjustments made for modern computers). When you run the game you first get a small pre-launch console where you can make adjustments to the game before launching it. We’d recommend you enable the pixel doubling filter. It makes the game a little blockier (hey, you are playing a 1980s remake!) but it fills up high resolution monitors more effectively. Hit up the link below to download King’s Quest Redux I-III free of charge for both Windows and Mac computers. King’s Quest Redux [AGD Interactive via @cbeust]

    Read the article

  • LLBLGen Pro and JSON serialization

    - by FransBouma
    I accidentally removed a reply from my previous blogpost, and as this blog engine here at weblogs.asp.net is apparently falling apart, I can't re-add it: it thought it would be wise to disable comment controls on all posts except new ones. So I'll post the reply here as a quote and reply to it. 'Steven' asks:

    What would the future be for LLBLGen Pro to support JSON for serialization? Would it be worth the effort for a LLBLGen Pro user to bother creating some code templates to produce additional JSON serializable classes? Or just create some basic POCO classes which could be used for exchange of client/server data and use DTOs to map these back to LLBLGen Pro ones? If I understand the workaround, it is at the expense of losing XML serialization.

    Well, as described in the previous post, you can enable JSON serialization with a couple of lines and some attribute assignments. However, the attributes might indeed break XML serialization, as described in the previous blogpost. This is the case if the service you're using serializes objects using the DataContract serializer: this serializer will refuse to serialize the entity objects to XML, because the entity objects implement IXmlSerializable and that is a no-go area for the DataContract serializer. However, if your service doesn't use a DataContract serializer, or you serialize the objects manually to XML using an XML serializer, you're fine. When you want to switch back to XML serialization instead of JSON in WebApi, and you have decorated the entity classes with the data-contract attributes, you can switch off the DataContract serializer by setting a global configuration setting:

    var xml = GlobalConfiguration.Configuration.Formatters.XmlFormatter;
    xml.UseXmlSerializer = true;

    This will make WebApi use the XmlSerializer and run the normal IXmlSerializable interface implementation.
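
    To make the DTO alternative from the quoted question concrete, here is a minimal sketch (this is not LLBLGen Pro's own API; CustomerEntity and its fields are hypothetical): a plain data-contract class for client/server exchange, mapped from an entity, which the WebApi JSON formatter can serialize without touching the entity's IXmlSerializable implementation.

    using System.Runtime.Serialization;

    [DataContract]
    public class CustomerDto
    {
        [DataMember] public string Id { get; set; }
        [DataMember] public string CompanyName { get; set; }

        // Hypothetical mapping from an LLBLGen Pro entity to the DTO.
        public static CustomerDto FromEntity(CustomerEntity entity)
        {
            return new CustomerDto { Id = entity.CustomerId, CompanyName = entity.CompanyName };
        }
    }

    A Web API action returning CustomerDto instances is then handled by the default JSON formatter (or by the XmlFormatter, if configured as shown above) without any attributes on the entity classes themselves.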

    Read the article

  • Testing Routes in ASP.NET MVC with MvcContrib

    - by Guilherme Cardoso
    I've decided to write about unit testing over the next weeks. If we decide to develop with the Test-Driven Development pattern, it's important not to forget the routes. This article shows how to test routes. I'm importing my routes from the RegisterRoutes method in the Global.asax of the Project.Web project created by default (in SetUp). I'm using ShouldMapTo() from MvcContrib: http://mvccontrib.codeplex.com/ The controller is specified in the ShouldMapTo() signature, and we use lambda expressions for the action and parameters that are passed to that controller.

    [SetUp]
    public void Setup()
    {
        Project.Web.MvcApplication.RegisterRoutes(RouteTable.Routes);
    }

    [Test]
    public void Should_Route_HomeController()
    {
        "~/Home".ShouldMapTo<HomeController>(action => action.Index());
    }

    [Test]
    public void Should_Route_EventsController()
    {
        "~/Events".ShouldMapTo<EventsController>(action => action.Index());

        // In this example, 44 is the Id for my Event and "Concert-DevaMatri-22-January-" is the title for that Event
        "~/Events/View/44/Concert-DevaMatri-22-January-"
            .ShouldMapTo<EventsController>(action => action.Read(44, "Concert-DevaMatri-22-January-"));
    }

    [TearDown]
    public void Teardown()
    {
        RouteTable.Routes.Clear();
    }
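
    For reference, the RegisterRoutes method being exercised above typically looks something like the sketch below. The Events route is an assumption made to match the test; the Default route is the standard one from the ASP.NET MVC project template.

    public static void RegisterRoutes(RouteCollection routes)
    {
        routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

        // Assumed custom route so that ~/Events/View/44/Concert-DevaMatri-22-January- maps to EventsController.Read(id, title)
        routes.MapRoute(
            "Events",
            "Events/View/{id}/{title}",
            new { controller = "Events", action = "Read" });

        routes.MapRoute(
            "Default",
            "{controller}/{action}/{id}",
            new { controller = "Home", action = "Index", id = UrlParameter.Optional });
    }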

    Read the article

  • Oracle’s AutoVue Enables Visual Decision Making

    - by Pam Petropoulos
    That old saying about a picture being worth a thousand words has never been truer.  Check out the latest reports from IDC Manufacturing Insights which highlight the importance of incorporating visual information in all facets of decision making and the role that Oracle’s AutoVue Enterprise Visualization solutions can play. Take a look at the excerpts below and be sure to click on the titles to read the full reports. Technology Spotlight: Optimizing the Product Life Cycle Through Visual Decision Making, August 2012 Manufacturers find it increasingly challenging to make effective product-related decisions as the result of expanded technical complexities, elongated supply chains, and a shortage of experienced workers. These factors challenge the traditional methodologies companies use to make critical decisions. However, companies can improve decision making by the use of visual decision making, which synthesizes information from multiple sources into highly usable visual context and integrates it with existing enterprise applications such as PLM and ERP systems. Product-related information presented in a visual form and shared across communities of practice with diverse roles, backgrounds, and job skills helps level the playing field for collaboration across business functions, technologies, and enterprises. Visual decision making can contribute to manufacturers making more effective product-related decisions throughout the complete product life cycle. This Technology Spotlight examines these trends and the role that Oracle's AutoVue and its Augmented Business Visualization (ABV) solution play in this strategic market. Analyst Connection: Using Visual Decision Making to Optimize Manufacturing Design and Development, September 2012 In today's environments, global manufacturers are managing a broad range of information. Data is often scattered across countless files throughout the product life cycle, generated by different applications and platforms. Organizations are struggling to utilize these multidisciplinary sources in an optimal way. Visual decision making is a strategy and technology that can address this challenge by integrating and widening access to digital information assets. Integrating with PLM and ERP tools across engineering, manufacturing, sales, and marketing, visual decision making makes digital content more accessible to employees and partners in the supply chain. The use of visual decision-making information rendered in the appropriate business context and shared across functional teams contributes to more effective product-related decision making and positively impacts business performance.

    Read the article

  • StackCenter 2 - Now in Public Beta!

    - by George Edison
    Visit now: http://stackcenter.quickmediasolutions.com/beta/ Feedback is appreciated! About StackCenter 2 has been brewing for quite some time now. Since the global inbox was introduced, the original StackCenter was rendered mostly useless and the need for a replacement was born. And now, I present StackCenter 2! Its goal is to be a dashboard for everything StackExchange such as rep. graphs, images, or whatever! Currently, there are 3 widgets and it is now possible to write your own - just follow the link on your dashboard's home page. After registering, simply click on the 'add widgets...' link to get started. Some things might not work quite right. (This is in beta after all.) Any feedback you can provide is welcome! License Closed source at this point. Platform StackCenter should run fine on any web browser that has JavaScript enabled. (StackCenter 2 uses a lot of JavaScript.) Contact Email: [email protected] Code PHP (using the CakePHP framework), JavaScript, and of course, HTML

    Read the article

  • Creating custom controls for ASP.NET

    - by jaullo
    While ASP.NET ships with many controls that make our lives easier, we often need additional functionality. One option is to build custom controls. This will be the first of several posts dedicated to showing how to create custom controls using very simple, easy-to-understand elements. For that we will only use a RegularExpressionValidator and a few regular expressions. In this example we extend the functionality of a textbox so that it validates credit card numbers. Our textbox must verify that there are 16 digits, in groups of 4, separated by a -.
    So, we create a new ASP.NET server control project. First we import the namespaces:

    Imports System.ComponentModel
    Imports System.Web
    Imports System.Web.UI.WebControls
    Imports System.Web.UI

    Second, we create our class:

    Public Class TextboxCreditCardNumber
    End Class

    Now we tell our class to inherit from TextBox:

    Public Class TextboxCreditCardNumber
        Inherits TextBox
    End Class

    Once we have this, our programming base is ready, so let's code the new functionality. We declare our variables and a public property that will hold the error message returned to the user; it is public so it can be customized:

    Private req As New RegularExpressionValidator
    Private mstrmensaje As String = "Número de Tarjeta Invalido"

    Public Property MensajeError() As String
        Get
            Return mstrmensaje
        End Get
        Set(ByVal value As String)
            mstrmensaje = value
        End Set
    End Property

    Now we define the OnInit method of our control, where we assign the properties and initialize our functions:

    Protected Overrides Sub OnInit(ByVal e As System.EventArgs)
        req.ControlToValidate = MyBase.ID
        req.ErrorMessage = mstrmensaje
        req.Display = ValidatorDisplay.Dynamic
        req.ValidationExpression = "^(\d{4}-){3}\d{4}$|^(\d{4} ){3}\d{4}$|^\d{16}$"
        Controls.Add(New LiteralControl("&nbsp;"))
        Controls.Add(req)
        MyBase.OnInit(e)
    End Sub

    And finally, we define the Render method (which is in charge of drawing our control):

    Protected Overrides Sub Render(ByVal writer As System.Web.UI.HtmlTextWriter)
        MyBase.Render(writer)
        req.RenderControl(writer)
    End Sub

    All that is left now is to compile our class and add our new control to the controls ToolBox so it can be used.

    Read the article

  • Watch IPL League 2010 Online On YouTube

    - by Suganya
    With the IPL League matches for 2010 starting tomorrow, many of us would be interested in watching the broadcast live online. The first match, between the Deccan Chargers and the Kolkata Knight Riders, starts tomorrow at 20.00 IST. IPL T20 lasts for 45 days, starting from March 12th 2010 and running till April 25th 2010. The entire league takes place in India. The opening ceremony takes place on the 12th, followed by the official game between the Deccan Chargers and the Kolkata Knight Riders. Most of us will not be able to watch the matches on television, so this year the IPL has joined hands with Google to make them available live on the net.
    How to Watch the IPL Match Live Online: Google and the IPL have joined hands to make the matches available online for all viewers around the world. To watch the IPL matches live online, log on here.
    How to Watch the IPL Match Live On Mobile: The IPL and GCV (Global Cricket Ventures) are tied up with July Systems. You can watch IPL matches live on mobile by accessing the link directly using a GPRS-enabled mobile, or you can simply call MiX (Mobile Internet Experience) from your mobile phone using the toll-free number 08 123 123 123. Once you call this toll-free number a link will be sent to your mobile. You can access that link to view the live cricket on your mobile.
    Watch IPL League Cricket Match Live Online
    Watch IPL League Cricket Match Live On Mobile
    Join us on Facebook to read all our stories right inside your Facebook news feed.

    Read the article

  • JMS Adapter Step 0 : Configuring the WLS-JMS resources

    - by [email protected]
    Before getting started with the JMS Adapter, we must configure the connection factories/JMS queues on the WLS admin console. In particular, we will be required to follow these steps:
    1. Create a connection factory. In our case, we will create an "XA Connection Factory". This step is mandatory if you need your JMS queues to participate in a global transaction.
    2. Create the WLS JMS queues.
    Creating the connection factory:
    1) Login to the WLS Admin console. On my setup, the url looks like "http://localhost:7001/console".
    2) Select Services -> Messaging -> JMS Modules -> SOAJMSModule as shown below. We can also create a new JMS Module, but I took the easier way out by selecting the SOAJMSModule.
    3) Click on "New" as shown in order to create the Connection Factory.
    4) Select the "Connection Factory" radio button and click "Next".
    5) Enter the Connection Factory properties as shown and click on "Finish".
    6) Target the connection factory to your managed server and click on "Finish".
    7) Now, go back and select the Connection Factory that you've just created (see Step 2 above). Click on "Transactions", enable XA, and click on "Save".

    Read the article

  • TFS 2010 Server Name Change

    - by PearlFactory
    So I thought I would change the name of my machine so that the other devs can find the TFS server easily. TFS 2005 had the cool command-line util tfsadminutil... alas, it is now gone. Here are the steps to complete.
    Edit the web.config, which on a default install is usually located at C:\Program Files\Microsoft Team Foundation Server 2010\Application Tier\Web Services\web.config:
    <add key="applicationDatabase" value="Data Source=JUSTIN\SQLI01;Initial Catalog=Tfs_Configuration;Integrated Security=True;" />
    The next step is to edit previous Solutions/Projects:
    1) Open the Solution file, i.e. ProductApp.sln
    2) Edit the SccTeamFoundationServer URL under the Global section, i.e. change this to the new name
    If you have the DB server on the same machine, you will need to go in and remove the existing db user account assigned to the TFS DBs. Remove the old [%machine_name%] value, i.e. the Tuned_Dev_PC_12\Justin user, from the above DBs. Now add the new Justin\Justin user account associated with the new machine name to the TFS & Reporting DBs... dbo or the TFSADMIN & TFSEXEC roles will do in this case (or add both). Now either re-apply the user or add the new account (removing the old account, i.e. Tuned_Dev_PC_12\Justin).
    If DB permissions are set up correctly you will get a screen that looks like this. If it pauses or gets stuck you need to look back at adding the correct DB perms to the JUSTIN\Justin user account.
    Also, if your project is still complaining about the old TFS name:
    1) Team\Connect to the new Team Foundation Server
    2) Add\Remove TFS
    3) Add the new TFS name
    Once you have connected to the new TFS server, reload your project from TFS; this way it removes a lot of the bugs that hang around in the local project\solution. This is similar to a VSS 2005 and older fix.
    Cheers (eta about 60-90 mins, so weigh up the need vs payoff). Shutdown, restart.

    Read the article

  • New Demos SOA Suite (11.1.1.6) & SOA Suite Foundation Pack (11.1.1.6)

    - by JuergenKress
    For access to the Oracle demo systems please visit OPN and talk to your Partner Expert. GSE is pleased to announce the availability of SOA and Foundation Pack 11g (11.1.1.6) Platform Portable images. Portable images now come as a VBox appliance.
    SOA 11.1.1.6 Platform Portable Version: This portable image comes with the latest SOA Suite products installed and configured. The VBox appliance facilitates easy maintenance of the image. Click here to download the portable image.
    FP 11.1.1.6 Platform Portable Version: Foundation Pack installed and configured on the SOA image, standing as a base for building cross-application integrations. Click here to download the portable image.
    In addition to the portable images, Global Sales Engineering would like to announce the availability of a hosted version of the SOA & BPM 11g (11.1.1.6) Solutions. Click here for more information.
    SOA Suite Foundation Pack Demo: The Foundation Pack (FP) demo showcases various tools and utilities of Foundation Pack, like:
    - Project Lifecycle Workbench (PLW)
    - JDeveloper - Service Constructor
    - Harvesting services to PLW/Oracle Enterprise Repository
    - Generation of Bill of Materials (BOM)
    - Creation of Deployment Plans / Harvester Settings
    - Tracking the Foundation Pack Fusion Order demo flow in the Enterprise Manager Console
    For more information on the demo click here.
    SOA & BPM Partner Community: For regular information on Oracle SOA Suite become a member of the SOA & BPM Partner Community. For registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center.

    Read the article

  • Microsoft MVP 2012 – ASP.NET/IIS

    - by hajan
    It’s Sunday. I wasn’t really sure whether I should expect something today or not, although it’s the 1st of July, when we all know that new and re-awarded MVPs should get the ‘Congratulations’ email from Microsoft. And YES! I GOT IT! This is my second year, and my first time being re-awarded… Microsoft MVP 2012. The feeling is exactly the same as the first time… I am honored, privileged, veeeery happy and thankful to Microsoft for this prestigious award! The past year was really great, with all the events, speaking engagements at various conferences and camps, many other community activities and a first-time visit to the MVP Global Summit. I am looking forward to boosting the Microsoft community activities even more in the next year... And… part of the email message:

    Dear Hajan Selmani,
    Congratulations! We are pleased to present you with the 2012 Microsoft® MVP Award! This award is given to exceptional technical community leaders who actively share their high quality, real world expertise with others. We appreciate your outstanding contributions in ASP.NET/IIS technical communities during the past year.

    I would like to say a big THANK YOU to all stakeholders. First of all, THANK YOU MICROSOFT for this prestigious award. Thanks to the CEE & Italy Region MVP Lead, Alessandro Teglia, who did a great job helping and supporting MVPs throughout the whole past year; I hope we will continue collaborating in the same way in the forthcoming year! Thanks to my family, friends, supporters, followers, those who read my blogs regularly and have made me reach thousands of comments on my ASP.NET blog :), and those who collaborate and work with me on a daily basis and support me in all my community activities. Thank You Everyone! There are a lot of new, exciting, great and innovative technologies in the Microsoft technology stack. I am excited and really looking forward to rocking the community in the years to come! THANK YOU! Hajan

    Read the article

  • E.T. II – Extinction [Fake Movie Sequel Video]

    - by Asian Angel
    It has been years since E.T. returned home and Elliot is all grown up now. But things are about to get a lot more interesting when E.T. returns to help save the Earth. The only problem is that the invaders are the people from E.T.’s own planet. Forget cute and cuddly…extinction is the name of the game in this well edited fake sequel trailer from YouTube user Robert Blankenheim. Warning: If you have fond, warm memories of the original E.T. movie, then you may want to skip this video. These extra-terrestrials are literally portrayed as pure, bloodthirsty evil. ET Sequel: “ET-X” (Extended Trailer) [via Geeks are Sexy]

    Read the article

  • ODI 11g – Expert Accelerator for Model Creation

    - by David Allan
    Following on from my post earlier this morning on scripting model and topology creation, tonight I thought I’d add a little UI to make those groovy functions a little more palatable. In OWB we have experts for capturing user input; with the groovy console we open up opportunities to build UI around the scripts in a very easy way – even I can do it ;-) After a little googling around I found some useful posts on SwingBuilder; the most useful one, which I used for the dialog below, was this one here. This dialog captures user input for the technology and context for the model, logical schema etc. to be created. You can see there are a variety of interesting controls, and it's really easy to do. The dialog captures the user's input, then when OK is pressed I call the functions from the earlier post to create the logical schema (plus all the other objects) and model. The image below shows what was created: you can see the model (with a typo in the name), the model is Oracle technology and references the logical schema ORACLE_SCOTT (that I named in the dialog above), the logical schema is mapped via the GLOBAL context to the data server ORACLE_SCOTT_DEV (that I named in the dialog above), and the physical schema used was just the user name that I connected with – so if you wanted a different user, the schema name could be added to the dialog. In a nutshell, one dialog that encapsulates a simpler mechanism for creating a model. You can create your own scripts that use dialogs like this, capture input and process it. You can find the groovy script for this here: odi_create_model.groovy. Again I wrapped the user capture code in a groovy function, return the result in a variable and then simply call the createLogicalSchema and createModel functions from the previous posting. The script I supplied above has everything you will need. To execute, use Tools->Groovy->Open Script and then click the green play button on the toolbar. Have fun.

    Read the article

  • Why does Ubuntu reset brightness settings at the loading screen?

    - by leugim
    Since I first installed Ubuntu 11.10, I noticed that volume and screen brightness get reset every time Ubuntu starts. Why is this so? And what ways are there to keep brightness and volume levels after rebooting?
    I have found some scripts that change the screen brightness at login. But this is not a good solution since login is slower: it seems to wait until the screen brightness is at the level specified by the script. After entering the password I see the screen brightness go down gradually. Only after this is complete (~1 or 2 seconds) does the background disappear and Unity come up. The screen brightness is not remembered but instead redefined at login. So it gets remembered for the first part of the boot, then set to MAX and then re-set to the normal value by the script.
    My boot process is as follows (desired brightness: 2 (13.33%) / max brightness: 15 (100%)):
    - BIOS / brightness: OK
    - GRUB (violet background color, white text) / brightness: OK
    - Ubuntu loading screen with the dots / brightness: MAX (Win7 loads with OK brightness)
    - User login / brightness: MAX
    - Unity starts / brightness: OK
    It seems to be more like a temporary patch than an actual solution. I'm looking for solutions that set the desired brightness permanently and consistently throughout the whole boot process.
    After updating to 12.04 the behavior is the same. I tried setpci -s 02:00.0 F4.B=XX. The value of F4.B is always '0' regardless of what value I try to set it to (tried 0, ff, f, 5, etc).
    The solution in this answer does not have any noticeable effect: Desktop doesn't remember brightness settings after a reboot. The variables at /sys/class/backlight/acpi_video0/ get changed if I use Fn+UP and Fn+DOWN.
    Any help is appreciated. Thanks!

    Read the article

  • Deferred vs Immediate execution in Linq

    - by Jalpesh P. Vadgama
    In this post, we are going to learn about deferred vs. immediate execution in Linq. There are interesting variations in how Linq operators execute, and in this post we are going to learn about both deferred execution and immediate execution.
    What is Deferred Execution? With deferred execution, the query is executed and evaluated at the time the query variable is used. Let’s take an example to understand deferred execution better.
    Example: Following is the code for that.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    namespace ConsoleApplication1
    {
        class Program
        {
            static void Main(string[] args)
            {
                var customers = new List<Customer>(
                    new[]
                    {
                        new Customer {FirstName = "Jalpesh", LastName = "Vadgama"},
                        new Customer {FirstName = "Vishal", LastName = "Vadgama"},
                        new Customer {FirstName = "Tushar", LastName = "Maru"}
                    });

                var newCustomers = customers.Where(c => c.LastName == "Vadgama");

                customers.Add(new Customer {FirstName = "Teerth", LastName = "Vadgama"});

                foreach (var c in newCustomers)
                {
                    Console.WriteLine(c.FirstName);
                }
            }

            public class Customer
            {
                public string FirstName { get; set; }
                public string LastName { get; set; }
            }
        }
    }

    More on my personal blog @www.dotnetjalps.com
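
    Because the Where query above is deferred, the customer added after the query is defined is still picked up when the query is enumerated, so "Teerth" is printed along with "Jalpesh" and "Vishal". For contrast, here is a minimal sketch of immediate execution, assuming the same Customer class and the same three initial customers: an operator such as ToList(), ToArray() or Count() evaluates the query right away, so later changes to the source list are not reflected.

    // Immediate execution: the query is evaluated and materialized right here.
    var snapshot = customers.Where(c => c.LastName == "Vadgama").ToList();

    // This customer is added after the query has already been evaluated...
    customers.Add(new Customer { FirstName = "Teerth", LastName = "Vadgama" });

    // ...so "Teerth" is NOT printed here; only "Jalpesh" and "Vishal" are.
    foreach (var c in snapshot)
    {
        Console.WriteLine(c.FirstName);
    }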

    Read the article

  • Advantages and benefits of migrating to the latest versions of JD Edwards

    - by Ramon Riera
    Last week I gave a webinar about the advantages and benefits of migrating to the latest versions of JD Edwards EnterpriseOne. The goal was not to cover in detail all the improvements in the latest releases, since there are many and they vary depending on your current version, but rather to highlight some of the general reasons why migrating would be worthwhile. In the video I review the history of JD Edwards and the strategy and commitment Oracle has to the product, as reflected in the Applications Unlimited and Lifetime Support policies and the product roadmap. I also cover the main improvements, focusing especially on the three main areas of improvement that JD Edwards has had and continues to have:
    · Functional, with the new Apparel Management, Fulfillment Management and Environmental Accounting modules
    · The improvements brought by Oracle technology, with special emphasis on the improvements in screen usability and the use of the iPad
    · The integrations with the rest of the Oracle applications, for example the BI analytical applications
    Finally, I discuss the resources Oracle provides to help customers migrate, which can be summarized on the web page www.upgradejde.com, where we make available all the information needed to plan a migration, such as: the current support status of each release; the products and improvements in each release; comparisons by release, both at the functional level and in terms of database changes; migration advice for evaluating, planning and executing a migration project; and all the release documentation. Below is the video:

    Read the article

  • My New BDD Style

    - by Liam McLennan
    I have made a change to my code-based BDD style. I start with a scenario such as:

    Pre-Editing
    * Given I am a book editor
    * And some chapters are locked and some are not
    * When I view the list of chapters for editing
    * Then I should see some chapters are editable and are not locked
    * And I should see some chapters are not editable and are locked

    and I implement it using a modified SpecUnit base class as:

    [Concern("Chapter Editing")]
    public class when_pre_editing_a_chapter : BaseSpec
    {
        private User i;
        // other context variables

        protected override void Given()
        {
            i_am_a_book_editor();
            some_chapters_are_locked_and_some_are_not();
        }

        protected override void Do()
        {
            i_view_the_list_of_chapters_for_editing();
        }

        private void i_am_a_book_editor()
        {
            i = new UserBuilder().WithUsername("me").WithRole(UserRole.BookEditor).Build();
        }

        private void some_chapters_are_locked_and_some_are_not() { }

        private void i_view_the_list_of_chapters_for_editing() { }

        [Observation]
        public void should_see_some_chapters_are_editable_and_are_not_locked() { }

        [Observation]
        public void should_see_some_chapters_are_not_editable_and_are_locked() { }
    }

    and the output from the specunit report tool is:

    Chapter Editing specifications (1 context, 2 specifications)
    Chapter Editing, when pre editing a chapter (2 specifications)
        should see some chapters are editable and are not locked
        should see some chapters are not editable and are locked

    The intent is to provide a clear mapping from story –> scenarios –> bdd tests.
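
    The post doesn't show the modified SpecUnit base class itself, but as a rough, hypothetical sketch (the NUnit-style wiring below is an assumption, not the author's actual code), the Given/Do template it relies on can be as small as this:

    using NUnit.Framework;

    public abstract class BaseSpec
    {
        [SetUp]
        public void EstablishContextAndAct()
        {
            Given(); // arrange the context described by the scenario
            Do();    // perform the action under test
        }

        protected abstract void Given();
        protected abstract void Do();
    }

    Each [Observation] in the derived class then only has to assert on the resulting state, which keeps the mapping from scenario steps to test methods one-to-one.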

    Read the article

  • Refactor This (Ugly Code)!

    - by Alois Kraus
    Ayende has put on his blog some ugly code to refactor. First and foremost it is nearly impossible to reason about other peoples code without knowing the driving forces behind the current code. It is certainly possible to make it much cleaner when potential sources of errors cannot happen in the first place due to good design. I can see what the intention of the code is but I do not know about every brittle detail if I am allowed to reorder things here and there to simplify things. So I decided to make it much simpler by identifying the different responsibilities of the methods and encapsulate it in different classes. The code we need to refactor seems to deal with a handler after a message has been sent to a message queue. The handler does complete the current transaction if there is any and does handle any errors happening there. If during the the completion of the transaction errors occur the transaction is at least disposed. We can enter the handler already in a faulty state where we try to deliver the complete event in any case and signal a failure event and try to resend the message again to the queue if it was not inside a transaction. All is decorated with many try/catch blocks, duplicated code and some state variables to route the program flow. It is hard to understand and difficult to reason about. In other words: This code is a mess and could be written by me if I was under pressure. Here comes to code we want to refactor:         private void HandleMessageCompletion(                                      Message message,                                      TransactionScope tx,                                      OpenedQueue messageQueue,                                      Exception exception,                                      Action<CurrentMessageInformation, Exception> messageCompleted,                                      Action<CurrentMessageInformation> beforeTransactionCommit)         {             var txDisposed = false;             if (exception == null)             {                 try                 {                     if (tx != null)                     {                         if (beforeTransactionCommit != null)                             beforeTransactionCommit(currentMessageInformation);                         tx.Complete();                         tx.Dispose();                         txDisposed = true;                     }                     try                     {                         if (messageCompleted != null)                             messageCompleted(currentMessageInformation, exception);                     }                     catch (Exception e)                     {                         Trace.TraceError("An error occured when raising the MessageCompleted event, the error will NOT affect the message processing"+ e);                     }                     return;                 }                 catch (Exception e)                 {                     Trace.TraceWarning("Failed to complete transaction, moving to error mode"+ e);                     exception = e;                 }             }             try             {                 if (txDisposed == false && tx != null)                 {                     Trace.TraceWarning("Disposing transaction in error mode");                     tx.Dispose();                 }             }             catch (Exception e)             {                 Trace.TraceWarning("Failed to dispose of transaction in error mode."+ e);             }             if (message == null)    
             return;                 try             {                 if (messageCompleted != null)                     messageCompleted(currentMessageInformation, exception);             }             catch (Exception e)             {                 Trace.TraceError("An error occured when raising the MessageCompleted event, the error will NOT affect the message processing"+ e);             }               try             {                 var copy = MessageProcessingFailure;                 if (copy != null)                     copy(currentMessageInformation, exception);             }             catch (Exception moduleException)             {                 Trace.TraceError("Module failed to process message failure: " + exception.Message+                                              moduleException);             }               if (messageQueue.IsTransactional == false)// put the item back in the queue             {                 messageQueue.Send(message);             }         }     You can see quite some processing and handling going on there. Yes this looks like real world code one did put together to make things work and he does not trust his callbacks. I guess these are event handlers which are optional and the delegates were extracted from an event to call them back later when necessary.  Lets see what the author of this code did intend:          private void HandleMessageCompletion(             TransactionHandler transactionHandler,             MessageCompletionHandler handler,             CurrentMessageInformation messageInfo,             ErrorCollector errors             )         {               // commit current pending transaction             transactionHandler.CallHandlerAndCommit(messageInfo, errors);               // We have an error for a null message do not send completion event             if (messageInfo.CurrentMessage == null)                 return;               // Send completion event in any case regardless of errors             handler.OnMessageCompleted(messageInfo, errors);               // put message back if queue is not transactional             transactionHandler.ResendMessageOnError(messageInfo.CurrentMessage, errors);         }   I did not bother to write the intention here again since the code should be pretty self explaining by now. I have used comments to explain the still nontrivial procedure step by step revealing the real intention about all this complex program flow. The original complexity of the problem domain does not go away but by applying the techniques of SRP (Single Responsibility Principle) and some functional style but we can abstract the necessary complexity away in useful abstractions which make it much easier to reason about it. Since most of the method seems to deal with errors I thought it was a good idea to encapsulate the error state of our current message in an ErrorCollector object which stores all exceptions in a list along with a description what the error all was about in the exception itself. We can log it later or not depending on the log level or whatever. It is really just a simple list that encapsulates the current error state.          
class ErrorCollector          {              List<Exception> _Errors = new List<Exception>();                public void Add(Exception ex, string description)              {                  ex.Data["Description"] = description;                  _Errors.Add(ex);              }                public Exception Last              {                  get                  {                      return _Errors.LastOrDefault();                  }              }                public bool HasError              {                  get                  {                      return _Errors.Count > 0;                  }              }          }   Since the error state is global we have two choices to store a reference in the other helper objects (TransactionHandler and MessageCompletionHandler)or pass it to the method calls when necessary. I did chose the latter one because a second argument does not hurt and makes it easier to reason about the overall state while the helper objects remain stateless and immutable which makes the helper objects much easier to understand and as a bonus thread safe as well. This does not mean that the stored member variables are stateless or thread safe as well but at least our helper classes are it. Most of the complexity is located the transaction handling I consider as a separate responsibility that I delegate to the TransactionHandler which does nothing if there is no transaction or Call the Before Commit Handler Commit Transaction Dispose Transaction if commit did throw In fact it has a second responsibility to resend the message if the transaction did fail. I did see a good fit there since it deals with transaction failures.          class TransactionHandler          {              TransactionScope _Tx;              Action<CurrentMessageInformation> _BeforeCommit;              OpenedQueue _MessageQueue;                public TransactionHandler(TransactionScope tx, Action<CurrentMessageInformation> beforeCommit, OpenedQueue messageQueue)              {                  _Tx = tx;                  _BeforeCommit = beforeCommit;                  _MessageQueue = messageQueue;              }                public void CallHandlerAndCommit(CurrentMessageInformation currentMessageInfo, ErrorCollector errors)              {                  if (_Tx != null && !errors.HasError)                  {                      try                      {                          if (_BeforeCommit != null)                          {                              _BeforeCommit(currentMessageInfo);                          }                            _Tx.Complete();                          _Tx.Dispose();                      }                      catch (Exception ex)                      {                          errors.Add(ex, "Failed to complete transaction, moving to error mode");                          Trace.TraceWarning("Disposing transaction in error mode");                          try                          {                              _Tx.Dispose();                          }                          catch (Exception ex2)                          {                              errors.Add(ex2, "Failed to dispose of transaction in error mode.");                          }                      }                  }              }                public void ResendMessageOnError(Message message, ErrorCollector errors)              {                  if (errors.HasError && !_MessageQueue.IsTransactional)                  {                      _MessageQueue.Send(message);         
         }              }          } If we need to change the handling in the future we have a much easier time to reason about our application flow than before. After we did complete our transaction and called our callback we can call the completion handler which is the main purpose of the HandleMessageCompletion method after all. The responsiblity o the MessageCompletionHandler is to call the completion callback and the failure callback when some error has occurred.            class MessageCompletionHandler          {              Action<CurrentMessageInformation, Exception> _MessageCompletedHandler;              Action<CurrentMessageInformation, Exception> _MessageProcessingFailure;                public MessageCompletionHandler(Action<CurrentMessageInformation, Exception> messageCompletedHandler,                                              Action<CurrentMessageInformation, Exception> messageProcessingFailure)              {                  _MessageCompletedHandler = messageCompletedHandler;                  _MessageProcessingFailure = messageProcessingFailure;              }                  public void OnMessageCompleted(CurrentMessageInformation currentMessageInfo, ErrorCollector errors)              {                  try                  {                      if (_MessageCompletedHandler != null)                      {                          _MessageCompletedHandler(currentMessageInfo, errors.Last);                      }                  }                  catch (Exception ex)                  {                      errors.Add(ex, "An error occured when raising the MessageCompleted event, the error will NOT affect the message processing");                  }                    if (errors.HasError)                  {                      SignalFailedMessage(currentMessageInfo, errors);                  }              }                void SignalFailedMessage(CurrentMessageInformation currentMessageInfo, ErrorCollector errors)              {                  try                  {                      if (_MessageProcessingFailure != null)                          _MessageProcessingFailure(currentMessageInfo, errors.Last);                  }                  catch (Exception moduleException)                  {                      errors.Add(moduleException, "Module failed to process message failure");                  }              }            }   If for some reason I did screw up the logic and we need to call the completion handler from our Transaction handler we can simple add to the CallHandlerAndCommit method a third argument to the MessageCompletionHandler and we are fine again. If the logic becomes even more complex and we need to ensure that the completed event is triggered only once we have now one place the completion handler to capture the state. During this refactoring I simple put things together that belong together and came up with useful abstractions. 
    If you look at the original argument list of the HandleMessageCompletion method I have put many things together:

    * Original argument: Message message -> New argument: CurrentMessageInformation messageInfo (encapsulates Message message)
    * Original arguments: TransactionScope tx, Action<CurrentMessageInformation> beforeTransactionCommit, OpenedQueue messageQueue -> New argument: TransactionHandler transactionHandler (encapsulates TransactionScope tx, OpenedQueue messageQueue, Action<CurrentMessageInformation> beforeTransactionCommit)
    * Original argument: Exception exception -> New argument: ErrorCollector errors
    * Original argument: Action<CurrentMessageInformation, Exception> messageCompleted -> New argument: MessageCompletionHandler handler (encapsulates Action<CurrentMessageInformation, Exception> messageCompleted and Action<CurrentMessageInformation, Exception> messageProcessingFailure)

    The reason is simple: Put the things that have relationships together and you will find nearly automatically useful abstractions. I hope this makes sense to you. If you see a way to make it even more simple you can show Ayende your improved version as well.

    Read the article

  • OTN Virtual Developer Day for WebLogic Server and WebLogic Developer Broadcasts

    - by mike.lehmann
    To get the new year of 2011 further underway for WebLogic Server, quite a series of hands-on technical online events and broadcasts are about to kick off from the WebLogic team. The first is Virtual Developer Day: Oracle WebLogic Server, an online event that combines hands-on labs with WebLogic Server through a series of VirtualBox images. This event will cover things like the new Java EE 6 capabilities one can use on WebLogic Server, using Maven and Hudson with WebLogic Server, developing with Web services on WebLogic Server and even upgrading from Oracle Application Server. Very technical, very hands-on. And it's global - multiple geographies covered. Nice! James Bayer has put out a full agenda for this on his blog as well as links on how to register. The second is a five-week series of weekly technical broadcasts under the umbrella of Accelerate Your Development with Oracle WebLogic Suite, walking through topics like working with JPA, designing distributed caching strategies with WebLogic Server, advanced JMS topics and UI topics like jQuery, as well as RESTful Web services with Jersey and JAX-RS. Again, the full agenda is available in James' blog to check out if it is interesting for you to attend, including a brief video introduction outlining in a bit more detail exactly what will be covered. Hopefully between these two events and the release of WebLogic Server 10.3.4 earlier in January, we are kicking off 2011 in a good fashion. Looking forward to sharing more as we go forward in 2011.

    Read the article

  • San Joaquin County, California Wins AIIM 2012 Carl E. Nelson Best Practice Award

    - by Peggy Chen
    Last month, AIIM, the global community of information professionals, announced the winners of the 2012 Carl E. Nelson Best Practices Awards, and San Joaquin County, California won in the small company category for 1-100 employees. The Carl E. Nelson Best Practices Award was established to recognize excellence in the area of information management. "Best practice" denotes a standard of excellence that has been achieved within an organization and refers to a process that can be quantified, adapted and repeated. Like many counties, San Joaquin County, California, was faced with huge challenges due to decreasing funds and staff, including decreased capability to build. It needed to streamline processes, cut costs per activity, modernize and strengthen the infrastructure, and adopt new technology and standards such as the National Information Exchange Model (NIEM). The Integrated Justice Information System (IJIS) provides a Web-based system to link more than 650,000 residents, 18 agencies countywide and other law enforcement systems nationwide. The county's modernization initiative focused on replacing its outdated warrant system, implementing service-oriented architecture (SOA) to simplify integration between county law and justice systems, deploying Business Process Management (BPM), Case Management with content management, and Web technologies from Oracle. A critical part of the county's success has been the proper alignment of its Strategic Vision to the way the organization was enabled to plan and execute (and continues to execute) its modernization project. Congratulations to San Joaquin County!

    Read the article

  • Debug.Assert replacement for Phone and Store apps

    - by Daniel Moth
    I don’t know about you, but all my code is, and always has been, littered with Debug.Assert statements. I think it all started way back in my (short-lived, but impactful to me) Eiffel days, when I was applying Design by Contract. Anyway, I can’t live without Debug.Assert. Imagine my dismay when I upgraded my Windows Phone 7.x app (Translator By Moth) to Windows Phone 8 and discovered that my Debug.Assert statements would not display anything on the target and would not break in the debugger any longer! Luckily, the solution was simple and in this post I share it with you – feel free to tweak it to meet your needs.

    Steps to use:
    1. Add a new code file to your project, delete all its contents, and paste in the code from MyDebug.cs.
    2. Perform a global search in your solution replacing Debug.Assert with MyDebug.Assert.
    3. Build the solution and test.

    Now, I do not know why this functionality was broken, but I do know that it exhibits the same broken characteristics for Windows Store apps. There is a simple workaround there: use Contract.Assert, which does display a message and offers an option to break in the debugger (although it doesn’t output the message to the Output window). Because I plan on code sharing between Phone and Windows 8 projects, I prefer to have the conditional compilation centralized, so I added the Contract.Assert workaround directly in the MyDebug class, so that you can use this class for both platforms – enjoy and enhance! Comments about this post by Daniel Moth welcome at the original blog.
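    Since the post links to MyDebug.cs rather than inlining it, here is a minimal sketch of what such a wrapper can look like; only the Contract.Assert workaround for Store apps is taken from the post, while the Phone-side behaviour (writing to the Output window and breaking when a debugger is attached) is an assumption you may want to adjust.

    using System.Diagnostics;

    // Minimal sketch of a Debug.Assert replacement - not the actual MyDebug.cs from the post.
    public static class MyDebug
    {
        [Conditional("DEBUG")]
        public static void Assert(bool condition, string message = "Assertion failed")
        {
            if (condition)
                return;

    #if NETFX_CORE
            // Windows Store apps: Contract.Assert shows a message and offers to break in the debugger.
            System.Diagnostics.Contracts.Contract.Assert(false, message);
    #else
            // Windows Phone: write the failure to the Output window and break if a debugger is attached.
            Debug.WriteLine("ASSERT FAILED: " + message);
            if (Debugger.IsAttached)
            {
                Debugger.Break();
            }
    #endif
        }
    }

    With this in place, the global replace in step 2 is the only change the rest of the code base needs.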

    Read the article

  • Commit in SQL

    - by PRajkumar
    SQL Transaction Control Language Commands (TCL) - COMMIT

    Commit Transaction

    As SQL developers we use the transaction control language very frequently. Committing a transaction means making permanent the changes performed by the SQL statements within the transaction. A transaction is a sequence of SQL statements that Oracle Database treats as a single unit. The COMMIT statement also erases all save points in the transaction and releases transaction locks.

    Oracle Database issues an implicit COMMIT before and after any data definition language (DDL) statement. Oracle recommends that you explicitly end every transaction in your application programs with a COMMIT or ROLLBACK statement, including the last transaction, before disconnecting from Oracle Database. If you do not explicitly commit the transaction and the program terminates abnormally, then the last uncommitted transaction is automatically rolled back.

    Until you commit a transaction:
    ·  You can see any changes you have made during the transaction by querying the modified tables, but other users cannot see the changes. After you commit the transaction, the changes are visible to other users' statements that execute after the commit.
    ·  You can roll back (undo) any changes made during the transaction with the ROLLBACK statement.

    Note: Most people think that typing COMMIT means the changes you have made are written to the data files, but this is wrong. When you type COMMIT you are saying that your job has been completed, and the Oracle engine then verifies that the transaction achieved consistency. When that check passes it sends a commit message to the user from the log buffer, not from the data buffer; only after writing the data to the log buffer does it instruct the data buffer to write the data to the data files. This is how it works.

    Before a transaction that modifies data is committed, the following has occurred:
    ·  Oracle has generated undo information. The undo information contains the old data values changed by the SQL statements of the transaction.
    ·  Oracle has generated redo log entries in the redo log buffer of the System Global Area (SGA). The redo log record contains the change to the data block and the change to the rollback block. These changes may go to disk before a transaction is committed.
    ·  The changes have been made to the database buffers of the SGA. These changes may go to disk before a transaction is committed.

    Note: The data changes for a committed transaction, stored in the database buffers of the SGA, are not necessarily written immediately to the data files by the database writer (DBWn) background process. This writing takes place when it is most efficient for the database to do so. It can happen before the transaction commits or, alternatively, it can happen some time after the transaction commits.

    When a transaction is committed, the following occurs:
    1. The internal transaction table for the associated undo tablespace records that the transaction has committed, and the corresponding unique system change number (SCN) of the transaction is assigned and recorded in the table.
    2. The log writer process (LGWR) writes redo log entries in the SGA's redo log buffers to the redo log file. It also writes the transaction's SCN to the redo log file. This atomic event constitutes the commit of the transaction.
    3. Oracle releases locks held on rows and tables.
    4. Oracle marks the transaction complete.

    Note: The default behavior is for LGWR to write redo to the online redo log files synchronously and for transactions to wait for the redo to go to disk before returning a commit to the user. However, for lower transaction commit latency, application developers can specify that redo be written asynchronously and that transactions do not need to wait for the redo to be on disk.

    The syntax of the COMMIT statement is:

    COMMIT [WORK] [COMMENT 'your comment'];

    ·  WORK is optional. The WORK keyword is supported for compliance with standard SQL. The statements COMMIT and COMMIT WORK are equivalent.

    Example - committing an insert:
    INSERT INTO table_name VALUES (val1, val2);
    COMMIT WORK;

    ·  COMMENT is also optional. This clause is supported for backward compatibility. Oracle recommends that you use named transactions instead of commit comments. Specify a comment to be associated with the current transaction. The 'text' is a quoted literal of up to 255 bytes that Oracle Database stores in the data dictionary view DBA_2PC_PENDING along with the transaction ID if a distributed transaction becomes in doubt. This comment can help you diagnose the failure of a distributed transaction.

    Example - the following statement commits the current transaction and associates a comment with it:
    COMMIT COMMENT 'In-doubt transaction Code 36, Call (415) 555-2637';

    ·  WRITE clause. Use this clause to specify the priority with which the redo information generated by the commit operation is written to the redo log. This clause can improve performance by reducing latency, thus eliminating the wait for an I/O to the redo log. Use this clause to improve response time in environments with stringent response time requirements where the following conditions apply:
    - The volume of update transactions is large, requiring that the redo log be written to disk frequently.
    - The application can tolerate the loss of an asynchronously committed transaction.
    - The latency contributed by waiting for the redo log write to occur contributes significantly to overall response time.
    You can specify the WAIT | NOWAIT and IMMEDIATE | BATCH clauses in any order.

    Example - to commit the same insert operation and instruct the database to buffer the change to the redo log, without initiating disk I/O, use the following COMMIT statement:
    COMMIT WRITE BATCH;

    Note: If you omit this clause, then the behavior of the commit operation is controlled by the COMMIT_WRITE initialization parameter, if it has been set. The default value of the parameter is the same as the default for this clause. Therefore, if the parameter has not been set and you omit this clause, then commit records are written to disk before control is returned to the user.

    WAIT | NOWAIT - Use these clauses to specify when control returns to the user. The WAIT parameter ensures that the commit will return only after the corresponding redo is persistent in the online redo log. Whether in BATCH or IMMEDIATE mode, when the client receives a successful return from this COMMIT statement, the transaction has been committed to durable media. A crash occurring after a successful write to the log can prevent the success message from returning to the client, in which case the client cannot tell whether or not the transaction committed. The NOWAIT parameter causes the commit to return to the client whether or not the write to the redo log has completed. This behavior can increase transaction throughput. With the WAIT parameter, if the commit message is received, then you can be sure that no data has been lost.

    Caution: With NOWAIT, a crash occurring after the commit message is received, but before the redo log record(s) are written, can falsely indicate to a transaction that its changes are persistent. If you omit this clause, then the transaction commits with the WAIT behavior.

    IMMEDIATE | BATCH - Use these clauses to specify when the redo is written to the log. The IMMEDIATE parameter causes the log writer process (LGWR) to write the transaction's redo information to the log. This option forces a disk I/O, so it can reduce transaction throughput. The BATCH parameter causes the redo to be buffered to the redo log, along with other concurrently executing transactions. When sufficient redo information is collected, a disk write of the redo log is initiated. This behavior is called "group commit", as redo for multiple transactions is written to the log in a single I/O operation. If you omit this clause, then the transaction commits with the IMMEDIATE behavior.

    ·  FORCE clause. Use this clause to manually commit an in-doubt distributed transaction or a corrupt transaction.
    ·  In a distributed database system, the FORCE 'string' [, integer] clause lets you manually commit an in-doubt distributed transaction. The transaction is identified by the 'string' containing its local or global transaction ID. To find the IDs of such transactions, query the data dictionary view DBA_2PC_PENDING. You can use integer to specifically assign the transaction a system change number (SCN). If you omit integer, then the transaction is committed using the current SCN.
    ·  The FORCE CORRUPT_XID 'string' clause lets you manually commit a single corrupt transaction, where string is the ID of the corrupt transaction. Query the V$CORRUPT_XID_LIST data dictionary view to find the transaction IDs of corrupt transactions. You must have DBA privileges to view V$CORRUPT_XID_LIST and to specify this clause.
    ·  Specify FORCE CORRUPT_XID_ALL to manually commit all corrupt transactions. You must have DBA privileges to specify this clause.

    Example - forcing an in-doubt transaction. The following statement manually commits a hypothetical in-doubt distributed transaction. Query the DBA_2PC_PENDING data dictionary view to find the transaction IDs of such in-doubt transactions; you must have DBA privileges to view DBA_2PC_PENDING and to issue this statement.
    COMMIT FORCE '22.57.53';

    Read the article

  • IPV6 auto configuration not working

    - by Allan Ruin
    In Windows 7, my computer can automatically get an IPv6 global address and use the IPv6 network, but in Ubuntu Natty I can't find out how to make stateless configuration work. My network is a university campus network, so I don't need tunnels. I think if something can be accomplished silently and successfully in Windows, it shouldn't be impossible in Linux. I tried manually editing /etc/network/interfaces and used a static IPv6 address, and I can use IPv6 this way, but I just want to use auto-configuration. I found this post: http://superuser.com/questions/33196/how-to-disable-autoconfiguration-on-ipv6-in-linux and tried
    sudo sysctl -w net.ipv6.conf.all.autoconf=1
    sudo sysctl -w net.ipv6.conf.all.accept_ra=1
    but without any luck. I got this in dmesg:
    root@natty-150:~# dmesg |grep IPv6
    [   26.239607] eth0: no IPv6 routers present
    [  657.365194] eth0: no IPv6 routers present
    [  719.101383] eth0: no IPv6 routers present
    [32864.604234] eth0: no IPv6 routers present
    [33267.619767] eth0: no IPv6 routers present
    [33341.507307] eth0: no IPv6 routers present
    I am not sure whether it matters, but then I set up a static IPv6 address (with gateway) and restarted the network, ran ping6 ipv6.google.com, and the IPv6 network was fine. This time an entry was added in dmesg:
    [33971.214920] eth0: no IPv6 routers present
    So I guess the complaint about no IPv6 routers does not matter? Here is the IPv6 forwarding setting, but I guess forwarding is used for radvd stuff?
    root@natty-150:/# cat /proc/sys/net/ipv6/conf/eth0/forwarding
    0
    After ajmitch mentioned the forwarding setting, I added this to the sysctl.conf file:
    net.ipv6.conf.all.autoconf = 1
    net.ipv6.conf.all.accept_ra = 1
    net.ipv6.conf.default.forwarding = 1
    net.ipv6.conf.lo.forwarding = 1
    net.ipv6.conf.eth0.forwarding = 1
    and then ran
    sysctl -p
    /etc/init.d/networking restart
    But this still doesn't work.

    Read the article

< Previous Page | 254 255 256 257 258 259 260 261 262 263 264 265  | Next Page >