Search Results

Search found 881 results on 36 pages for 'audit trail'.


  • Overview of SOA Diagnostics in 11.1.1.6

    - by ShawnBailey
    What tools are available for diagnosing SOA Suite issues? There are a variety of tools available to help you and Support diagnose SOA Suite issues in 11g, but it can be confusing as to which tool is appropriate for a particular situation and what their relationships are. This blog post will introduce the various tools and attempt to clarify what each is for and how they are related. Let's first list the tools we'll be addressing: RDA: Remote Diagnostic Agent DFW: Diagnostic Framework Selective Tracing DMS: Dynamic Monitoring Service ODL: Oracle Diagnostic Logging ADR: Automatic Diagnostics Repository ADRCI: Automatic Diagnostics Repository Command Interpreter WLDF: WebLogic Diagnostic Framework This overview is not meant to be a comprehensive guide on using all of these tools; however, extensive reference materials are included that will provide many more details on their execution. Another point to note is that all of these tools are applicable to Fusion Middleware as a whole, but specific products may or may not have implemented features to leverage them. A couple of the tools have a WebLogic Scripting Tool or 'WLST' interface. WLST is a command interface for executing pre-built functions and custom scripts against a domain. A detailed WLST tutorial is beyond the scope of this post but you can find general information here. There are more specific resources in the sections below. In this post when we refer to 'Enterprise Manager' or 'EM' we are referring to Enterprise Manager Fusion Middleware Control. RDA (Remote Diagnostic Agent) RDA is a standalone tool that is used to collect both static configuration and dynamic runtime information from the SOA environment. RDA is generally run manually from the command line against a domain or single server. When opening a new Service Request, including an RDA collection can dramatically decrease the back and forth required to collect logs and configuration information for Support. After installing RDA you configure it to use the SOA Suite module as described in the referenced resources. The SOA module includes the Oracle WebLogic Server (WLS) module by default in order to include all of the relevant information for the environment. In addition to this basic configuration there is also an advanced mode where you can set the number of thread dumps for the collections, log files, Incidents, etc. When would you use it? When creating a Service Request or otherwise working with Oracle resources on an issue, or when capturing environment snapshots to baseline your configuration or to diagnose an issue on your own. How is it related to the other tools? RDA is related to DFW in that it collects the last 10 Incidents from the server by default. In a similar manner, RDA is related to ODL through its collection of the diagnostic logs, and these may contain information from Selective Tracing sessions.
    Examples of what it currently collects: (for details please see the links in the Resources section) Diagnostic Logs (ODL) Diagnostic Framework Incidents (DFW) SOA MDS Deployment Descriptors SOA Repository Summary Statistics Thread Dumps Complete Domain Configuration RDA Resources: Webcast Recording: Using RDA with Oracle SOA Suite 11g Blog Post: Diagnose SOA Suite 11g Issues Using RDA Download RDA How to Collect Analysis Information Using RDA for Oracle SOA Suite 11g Products [ID 1350313.1] How to Collect Analysis Information Using RDA for Oracle SOA Suite and BPEL Process Manager 11g [ID 1352181.1] Getting Started With Remote Diagnostic Agent: Case Study - Oracle WebLogic Server (Video) [ID 1262157.1] DFW (Diagnostic Framework) DFW provides the ability to collect specific information for a particular problem when that problem occurs. DFW is included with your SOA Suite installation and deployed to the domain. Let's define the components of DFW. Diagnostic Dumps: Specific diagnostic collections that are defined at either the 'system' or product level. Examples would be diagnostic logs or thread dumps. Incident: A collection of Diagnostic Dumps associated with a particular problem. Log Conditions: An Oracle Diagnostic Logging event that DFW is configured to listen for. If the event is identified then an Incident will be created. WLDF Watch: The WebLogic Diagnostic Framework or 'WLDF' is not a component of DFW; however, it can be a source of DFW Incident creation through the use of a 'Watch'. WLDF Notification: A Notification is a component of WLDF and is the link between the Watch and DFW. You can configure multiple Notification types in WLDF and associate them with your Watches. 'FMWDFW-notification' is available to you out of the box to allow for DFW notification of Watch execution. Rule: Defines a WLDF Watch or Log Condition for which we want to associate a set of Diagnostic Dumps. When triggered, the specified dumps will be collected and added to the Incident. Rule Action: Defines the specific Diagnostic Dumps to collect for a particular rule. ADR: Automatic Diagnostics Repository; defined for every server in a domain. This is where Incidents are stored. Now let's walk through a simple flow: Oracle Web Services error message OWS-04086 (SOAP Fault) is generated on managed server 1. The DFW Log Condition for OWS-04086 evaluates to TRUE. DFW creates a new Incident in the ADR for managed server 1. DFW executes the specified Diagnostic Dumps and adds the output to the Incident. In this case we'll grab the diagnostic log and thread dump. We might also want to collect the WSDL binding information and SOA audit trail. When would you use it? When you want to automatically collect Diagnostic Dumps at a particular time using a trigger or when you want to manually collect the information. In either case it can be readily uploaded to Oracle Support through the Service Request. How is it related to the other tools? DFW generates Incidents, which are collections of Diagnostic Dumps. One of the system level Diagnostic Dumps collects the current server diagnostic log, which is generated by ODL and can contain information from Selective Tracing sessions. Incidents are included in RDA collections by default, and ADRCI is a tool that is used to package an Incident for upload to Oracle Support. In addition, both ODL and DMS can be used to trigger Incident creation through DFW. The conditions and rules for generating Incidents can become quite complicated and the resources below go into more detail.
    A simpler approach to leveraging at least the Diagnostic Dumps is through WLST (WebLogic Scripting Tool), where there are commands to do the following: Create an Incident, Execute a single Diagnostic Dump, Describe a Diagnostic Dump, List the available Diagnostic Dumps (a short WLST sketch of these commands follows this excerpt). The WLST option offers greater control over what is generated and when. It can be a great help when collecting information for Support. There are overlaps with RDA; however, DFW is geared towards collecting specific runtime information when an issue occurs, while existing Incidents are collected by RDA. There are 3 WLDF Watches configured by default in a SOA Suite 11g domain: Stuck Threads, Unchecked Exception and Deadlock. These Watches are enabled by default and will generate Incidents in ADR. They are configured to reset automatically after 30 seconds, so they have the potential to create multiple Incidents if these conditions are consistent. The Incidents generated by these Watches will only contain System level Diagnostic Dumps. These same System level Diagnostic Dumps will be included in any application scoped Incident as well. Starting in 11.1.1.6, SOA Suite includes its own set of application scoped Diagnostic Dumps that can be executed from WLST or through a WLDF Watch or Log Condition. These Diagnostic Dumps can be added to an Incident such as in the earlier example using the error code OWS-04086. soa.config: MDS configuration files and deployed-composites.xml soa.composite: All artifacts related to the deployed composite soa.wsdl: Summary of endpoints configured for the composite soa.edn: EDN configuration summary if applicable soa.db: Summary DB information for the SOA repository soa.env: Coherence cluster configuration summary soa.composite.trail: Partial audit trail information for the running composite. The current release of RDA has the option to collect the soa.wsdl and soa.composite Diagnostic Dumps. More Diagnostic Dumps for SOA Suite products are planned for future releases along with enhancements to DFW itself. DFW Resources: Webcast Recording: SOA Diagnostics Sessions: Diagnostic Framework Diagnostic Framework Documentation DFW WLST Command Reference Documentation for SOA Diagnostic Dumps in 11.1.1.6 Selective Tracing Selective Tracing is a facility available starting in version 11.1.1.4 that allows you to increase the logging level for specific loggers and for a specific context. What this means is that you have greater capability to collect needed diagnostic log information in a production environment with reduced overhead. For example, a Selective Tracing session can be executed that only increases the log level for one composite, only one logger, limited to one server in the cluster and for a preset period of time. In an environment where dozens of composites are deployed this can dramatically reduce the volume and overhead of the logging without sacrificing relevance. Selective Tracing can be administered either from Enterprise Manager or through WLST. WLST provides a bit more flexibility in terms of exactly where the tracing is run. When would you use it? When there is an issue in production or another environment that lends itself to filtering by an available context criterion, and increasing the log level globally results in too much overhead or irrelevant information. The information is written to the server diagnostic log and is exportable from Enterprise Manager. How is it related to the other tools? Selective Tracing output is written to the server diagnostic log.
    This log can be collected by a system level Diagnostic Dump using DFW or through a default RDA collection. Selective Tracing also heavily leverages ODL fields to determine what to trace and to tag information that is part of a particular tracing session. Available Context Criteria: Application Name Client Address Client Host Composite Name User Name Web Service Name Web Service Port Selective Tracing Resources: Webcast Recording: SOA Diagnostics Session: Using Selective Tracing to Diagnose SOA Suite Issues How to Use Selective Tracing for SOA [ID 1367174.1] Selective Tracing WLST Reference DMS (Dynamic Monitoring Service) DMS exposes runtime information for monitoring. This information can be monitored in two ways: Through the DMS servlet As exposed MBeans The servlet is deployed by default and can be accessed through http://<host>:<port>/dms/Spy (use administrative credentials to access). The landing page of the servlet shows identical columns of what are known as Noun Types. If you select a Noun Type you will see a table in the right frame that shows the attributes (Sensors) for the Noun Type and the available instances. SOA Suite has several exposed Noun Types that are available for viewing through the Spy servlet. Screenshots of the Spy servlet are available in the Knowledge Base article How to Monitor Runtime SOA Performance With the Dynamic Monitoring Service (DMS). Every Noun instance in the runtime is exposed as an MBean instance. As such they are generally available through an MBean browser and available for monitoring through WLDF. You can configure a WLDF Watch to monitor a particular attribute and fire a notification when the threshold is exceeded. A WLDF Watch can use the out of the box DFW notification type to notify DFW to create an Incident. When would you use it? When you want to monitor a metric or set of metrics either manually or through an automated system. When you want to trigger a WLDF Watch based on a metric exposed through DMS. How is it related to the other tools? DMS metrics can be monitored with WLDF Watches which can in turn notify DFW to create an Incident. DMS Resources: How to Monitor Runtime SOA Performance With the Dynamic Monitoring Service (DMS) [ID 1368291.1] How to Reset a SOA 11g DMS Metric DMS Documentation ODL (Oracle Diagnostic Logging) ODL is the primary facility for most Fusion Middleware applications to log what they are doing. Whenever you change a logging level through Enterprise Manager it is ultimately exposed through ODL and written to the server diagnostic log. A notable exception to this is WebLogic Server which uses its own log format / file. ODL logs entries in a consistent, structured way using predefined fields and name/value pairs. Here's an example of a SOA Suite entry: [2012-04-25T12:49:28.083-06:00] [AdminServer] [ERROR] [] [oracle.soa.bpel.engine] [tid: [ACTIVE].ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: ] [ecid: 0963fdde7e77631c:-31a6431d:136eaa46cda:-8000-00000000000000b4,0] [errid: 41] [WEBSERVICE_PORT.name: BPELProcess2_pt] [APP: soa-infra] [composite_name: TestProject2] [J2EE_MODULE.name: fabric] [WEBSERVICE.name: bpelprocess1_client_ep] [J2EE_APP.name: soa-infra] Error occured while handling a post operation[[ When would you use it? You'll use ODL almost every time you want to identify and diagnose a problem in the environment. The entries are written to the server diagnostic log. How is it related to the other tools? The server diagnostic logs are collected by DFW and RDA.
    Selective Tracing writes its information to the diagnostic log as well. Additionally, DFW log conditions are triggered by ODL log events. ODL Resources: ODL Documentation ADR (Automatic Diagnostics Repository) ADR is not a tool in and of itself but is where DFW stores the Incidents it creates. Every server in the domain has an ADR location which can be found under <SERVER_HOME>/adr. This is referred to as the ADR 'Base' location. ADR also has what are known as 'Home' locations. Example: You have a domain called 'myDomain' and an associated managed server called 'myServer'. Your admin server is called 'AdminServer'. Your domain home directory is called 'myDomain' and it contains a 'servers' directory. The 'servers' directory contains a directory for the managed server called 'myServer' and here is where you'll find the 'adr' directory which is the ADR 'Base' location for myServer. To get to the ADR 'Home' locations we drill through a few levels: diag/ofm/myDomain/ In an 11.1.1.6 SOA Suite domain you will see 2 directories here, 'myServer' and 'soa-infra'. These are the ADR 'Home' locations. 'myServer' is the 'system' ADR home and contains system level Incidents. 'soa-infra' is the name that SOA Suite used to register with DFW and this ADR home contains SOA Suite related Incidents. Each ADR home location contains a series of directories, one of which is called 'incident'. This is where your Incidents are stored. When would you use it? It's a good idea to check on these locations from time to time to see whether a lot of Incidents are being generated. They can be cleaned out by deleting the Incident directories or through the ADRCI tool. If you know that an Incident is of particular interest for an issue you're working with Oracle you can simply zip it up and provide it. How does it relate to the other tools? ADR is obviously very important for DFW since it's where the Incidents are stored. Incidents contain Diagnostic Dumps that may relate to diagnostic logs (ODL) and DMS metrics. The most recent 10 Incident directories are collected by RDA by default, and ADRCI relies on the ADR locations to help manage the contents. ADRCI (Automatic Diagnostics Repository Command Interpreter) ADRCI is a command line tool for packaging and managing Incidents. When would you use it? When purging Incidents from an ADR Home location or when you want to package an Incident along with an offline RDA collection for upload to Oracle Support. How does it relate to the other tools? ADRCI contains a tool called the Incident Packaging System or IPS. This is used to package an Incident for upload to Oracle Support through a Service Request. Starting in 11.1.1.6, IPS will attempt to collect an offline RDA collection and include it with the Incident package. This will only work if Perl is available on the path; otherwise it will give a warning and package only the Incident files. ADRCI Resources: How to Use the Incident Packaging System (IPS) in SOA 11g [ID 1381259.1] ADRCI Documentation WLDF (WebLogic Diagnostic Framework) WLDF is functionality available in WebLogic Server since version 9. Starting with FMW 11g, a link has been added between WLDF and the pre-existing DFW: the WLDF Watch Notification. Let's take a closer look at the flow: There is a need to monitor the performance of your SOA Suite message processing. A WLDF Watch is created in the WLS console that will trigger if the average message processing time exceeds 2 seconds. This metric is monitored through a DMS MBean instance.
    The out of the box DFW Notification (the Notification is called FMWDFW-notification) is added to the Watch. Under the covers this notification is of type JMX. The Watch is triggered when the threshold is exceeded and fires the Notification. DFW has a listener that picks up the Notification and evaluates it according to its rules, etc. When it comes to automatic Incident creation, WLDF is a key component with capabilities that will grow over time. When would you use it? When you want to monitor the WLS server log or an MBean metric for some condition and fire a notification when the Watch is triggered. How does it relate to the other tools? WLDF is used to automatically trigger Incident creation through DFW using the DFW Notification. WLDF Resources: How to Monitor Runtime SOA Performance With the Dynamic Monitoring Service (DMS) [ID 1368291.1] How To Script the Creation of a SOA WLDF Watch in 11g [ID 1377986.1] WLDF Documentation
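
    As a rough illustration of the WLST route mentioned above, here is a minimal WLST (Jython) sketch of the Diagnostic Dump commands. The command names (listDumps, describeDump, executeDump, createIncident) come from the DFW WLST Command Reference cited in the excerpt; the connection URL, credentials, dump name and output path are placeholders, and the exact argument names should be verified against that reference.

        # Connect to the Admin Server first (placeholder URL and credentials)
        connect('weblogic', 'welcome1', 't3://adminhost:7001')

        # See which Diagnostic Dumps are available and what one of them does
        listDumps()
        describeDump(name='soa.config')

        # Execute a single dump to a file (dump name and path are illustrative)
        executeDump(name='soa.config', outputFile='/tmp/soa_config_dump.txt')

        # Create an Incident manually, here keyed to the OWS-04086 example above
        createIncident(messageId='OWS-04086', description='SOAP fault investigation')

        disconnect()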

    Read the article

  • CodePlex Daily Summary for Tuesday, June 03, 2014

    CodePlex Daily Summary for Tuesday, June 03, 2014Popular ReleasesQuickMon: Version 3.14 (Pie release): This is unofficially the 'Pie' release. There are two big changes.1. 'Presets' - basically templates. Future releases might build on this to allow users to add more presets. 2. MSI Installer now allows you to choose components (in case you don't want all collectors etc.). This means you don't have to download separate components anymore (AllAgents.zip still included in case you want to use them separately) Some other changes:1. Add/changed default file extension for monitor packs to *.qmp (...VeraCrypt: VeraCrypt version 1.0d: Changes between 1.0c and 1.0d (03 June 2014) : Correct issue while creating hidden operating system. Minor fixes (look at git history for more details).Keepass2Android: 0.9.4-pre1: added plug-in support: See settings for how to get plug-ins! published QR plug-in (scan passwords, display passwords as QR code, transfer entries to other KP2A devices) published InputStick plugin (transfer credentials to your PC via bluetooth - requires InputStick USB stick) Third party apps can now simply implement querying KP2A for credentials. Are you a developer? Please add this to your app if suitable! added TOTP support (compatible with KeeOTP and TrayTotp) app should no l...Microsoft Web Protection Library: AntiXss Library 4.3.0: Download from http://www.microsoft.com/en-us/download/details.aspx?id=43126 This issue finally addresses the over zealous behaviour of the HTML Sanitizer which should now function as expected once again. HTML encoding has been changed to safelist a few more characters for webforms compatibility. This will be the last version of AntiXSS that contains a sanitizer. Any new releases will be encoding libraries only. We recommend you explore other sanitizer options, for example AntiSamy htt...Z SqlBulkCopy Extensions: SqlBulkCopy Extensions 1.0.0: SqlBulkCopy Extensions provide MUST-HAVE methods with outstanding performance missing from the SqlBulkCopy class like Delete, Update, Merge, Upsert. Compatible with .NET 2.0, SQL Server 2000, SQL Azure and more! Bulk MethodsBulkDelete BulkInsert BulkMerge BulkUpdate BulkUpsert Utility MethodsGetSqlConnection GetSqlTransaction You like this library? Find out how and why you should support Z Project Become a Memberhttp://zzzproject.com/resources/images/all/become-a-member.png|ht...Portable Class Library for SQLite: Portable Class Library for SQLite - 3.8.4.4: This pull request from mattleibow addresses an issue with custom function creation (define functions in C# code and invoke them from SQLite as id they where regular SQL functions). Impact: Xamarin iOSTweetinvi a friendly Twitter C# API: Tweetinvi 0.9.3.x: Timelines- Added all the parameters available from the Timeline Endpoints in Tweetinvi. - This is available for HomeTimeline, UserTimeline, MentionsTimeline // Simple query var tweets = Timeline.GetHomeTimeline(); // Create a parameter for queries with specific parameters var timelineParameter = Timeline.CreateHomeTimelineRequestParameter(); timelineParameter.ExcludeReplies = true; timelineParameter.TrimUser = true; var tweets = Timeline.GetHomeTimeline(timelineParameter); Tweetinvi 0.9.3.1...Sandcastle Help File Builder: Help File Builder and Tools v2014.5.31.0: General InformationIMPORTANT: On some systems, the content of the ZIP file is blocked and the installer may fail to run. 
Before extracting it, right click on the ZIP file, select Properties, and click on the Unblock button if it is present in the lower right corner of the General tab in the properties dialog. This release completes removal of the branding transformations and implements the new VS2013 presentation style that utilizes the new lightweight website format. Several breaking cha...Image View Slider: Image View Slider: This is a .NET component. We create this using VB.NET. Here you can use an Image Viewer with several properties to your application form. We wish somebody to improve freely. Try this out! Author : Steven Renaldo Antony Yustinus Arjuna Purnama Putra Andre Wijaya P Martin Lidau PBK GENAP 2014 - TI UKDWAspose for Apache POI: Missing Features of Apache POI WP - v 1.1: Release contain the Missing Features in Apache POI WP SDK in Comparison with Aspose.Words for dealing with Microsoft Word. What's New ?Following Examples: Insert Picture in Word Document Insert Comments Set Page Borders Mail Merge from XML Data Source Moving the Cursor Feedback and Suggestions Many more examples are yet to come here. Keep visiting us. Raise your queries and suggest more examples via Aspose Forums or via this social coding site.babelua: V1.5.6.0: V1.5.6.0 - 2014.5.30New feature: support quick-cocos2d-x project now; support text search in scripts folder now, you can use this function in Search Result Window;Credit Component: Credit Component: This is a sample release of Credit Component that has been made by Microsoft Visual Studio 2010. To try and use it, you need .NET framework 4.0 and Microsoft Visual Studio 2010 or newer as a minimum requirement in this download you will get media player as a sample application that use this component credit component as a main component media player source code as source code and sample usage of credit component credit component source code as source code of credit component important...SEToolbox: 01.032.014 Release 1: Added fix when loading game Textures for icons causing 'Unable to read beyond the end of the stream'. Added new Resource Report, that displays all in game resources in a concise report. Added in temp directory cleaner, to keep excess files from building up. Fixed use of colors on the windows, to work better with desktop schemes. Adding base support for multilingual resources. This will allow loading of the Space Engineers resources to show localized names, and display localized date a...ClosedXML - The easy way to OpenXML: ClosedXML 0.71.2: More memory and performance improvements. Fixed an issue with pivot table field order.Composite Iconote: Composite Iconote: This is a composite has been made by Microsoft Visual Studio 2013. Requirement: To develop this composite or use this component in your application, your computer must have .NET framework 4.5 or newer.Magick.NET: Magick.NET 6.8.9.101: Magick.NET linked with ImageMagick 6.8.9.1. Breaking changes: - Int/short Set methods of WritablePixelCollection are now unsigned. - The Q16 build no longer uses HDRI, switch to the new Q16-HDRI build if you need HDRI.fnr.exe - Find And Replace Tool: 1.7: Bug fixes Refactored logic for encoding text values to command line to handle common edge cases where find/replace operation works in GUI but not in command line Fix for bug where selection in Encoding drop down was different when generating command line in some cases. 
It was reported in: https://findandreplace.codeplex.com/workitem/34 Fix for "Backslash inserted before dot in replacement text" reported here: https://findandreplace.codeplex.com/discussions/541024 Fix for finding replacing...VG-Ripper & PG-Ripper: VG-Ripper 2.9.59: changes NEW: Added Support for 'GokoImage.com' links NEW: Added Support for 'ViperII.com' links NEW: Added Support for 'PixxxView.com' links NEW: Added Support for 'ImgRex.com' links NEW: Added Support for 'PixLiv.com' links NEW: Added Support for 'imgsee.me' links NEW: Added Support for 'ImgS.it' linksToolbox for Dynamics CRM 2011/2013: XrmToolBox (v1.2014.5.28): XrmToolbox improvement XrmToolBox updates (v1.2014.5.28)Fix connecting to a connection with custom authentication without saved password Tools improvement New tool!Solution Components Mover (v1.2014.5.22) Transfer solution components from one solution to another one Import/Export NN relationships (v1.2014.3.7) Allows you to import and export many to many relationships Tools updatesAttribute Bulk Updater (v1.2014.5.28) Audit Center (v1.2014.5.28) View Layout Replicator (v1.2014.5.28) Scrip...Microsoft Ajax Minifier: Microsoft Ajax Minifier 5.10: Fix for Issue #20875 - echo switch doesn't work for CSS CSS should honor the SASS source-file comments JS should allow multi-line comment directivesNew ProjectsAirline Management Solutions: Three layers architecture PHP Maria DB Metro-StyleBAOnline: tttboomteam: Fitness videoscsv2xlsx: this project was created to simplify process of converting csv text files to Excel tables. It uses Apache POI to work with Excel Hazza.ShapeField: Adds a field that lets you input the name of a shape to be displayed.HP AGM Monitor Service: Just for internal usage.HP AGM RestAPI Wrapper: HP AGM Rest API .Net WrapperIO Performance Verifier: IO Performance verifier is for verifying IO from fx SAN/NAS in a virtualized environment on Windows servers. Useful to verify SLA or configuration change effectIRIS Tutorials: This a repository of tutorials for the IRIS Toolbox project.Node Service Host: Host application to run node app as a service. Runs service as root with app as specified user, restarts, logging. Linux, OSX and windows.SEND SMS ALERT FROM YOUR SOFTWARE / WEBSITE: SMS API allows you to send SMS to all mobile operators across Pakistan or any other country at very very cheap rates. It allows you to send SMS through http://CSharePoint Audit Facilities Demo: Sample demo code for SharePoint Audit Log extraction and methods used. SharePoint Permission Analyzer: Permission Analyzer will scan through a SharePoint site collection and create a permission structure of the site. Works on SharePoint 2010 and SP 2013Spotify WinRT Component: WinRT Component wrapper for libspotify https://developer.spotify.com/technologies/libspotify/Windows API Interop Library: A collection of interop code in C# for the Windows API. Key exported methods, constants and structures defined. Some extension methods for WinForms controls.WPF Pricing unit: await async wpf Entity framework 6zzswire: ????

    Read the article

  • Issue in Creating an Insert Query (See Description Below)

    - by Parth
    I am creating a Insert Query using PHP.. By fetching the data from a Audit table and iterating the values of it in loops.. table from which I am fetching the value has the snapshot below: The Code I am using to create is given below: mysql_select_db('information_schema'); $select = mysql_query("SELECT TABLE_NAME FROM TABLES WHERE TABLE_SCHEMA = 'pranav_test'"); $selectclumn = mysql_query("SELECT * FROM COLUMNS WHERE TABLE_SCHEMA = 'pranav_test'"); mysql_select_db('pranav_test'); $seletaudit = mysql_query("SELECT * FROM jos_audittrail WHERE live = 0"); $tables = array(); $i = 0; while($row = mysql_fetch_array($select)) { $tables[$i++] =$row['TABLE_NAME']; } while($row2 = mysql_fetch_array($seletaudit)) { $audit[] =$row2; } foreach($audit as $val) { if($val['operation'] == "INSERT") { if(in_array($val['table_name'],$tables)) { $insert = "INSERT INTO '".$val['table_name']."' ("; $selfld = mysql_query("SELECT field FROM jos_audittrail WHERE table_name = '".$val['table_name']."' AND operation = 'INSERT' AND trackid = '".$val['trackid']."'"); while($row3 = mysql_fetch_array($selfld)) { $values[] = $row3; } foreach($values as $field) { $insert .= "'".$field['field']."', "; } $insert .= "]"; $insert = str_replace(", ]",")",$insert); $insert .= " values ("; $selval = mysql_query("SELECT newvalue FROM jos_audittrail WHERE table_name = '".$val['table_name']."' AND operation = 'INSERT' AND trackid = '".$val['trackid']."' AND live = 0"); while($row4 = mysql_fetch_array($selval)) { $value[] = $row4; } /*echo "<pre>"; print_r($value);exit;*/ foreach($value as $data) { $insert .= "'".$data['newvalue']."', "; } $insert .= "["; $insert = str_replace(", [",")",$insert); } } } When I Echo the $insert out of the most outer for loop (for auditrail) The values get printed as many times as the records are found for the outer for loop..i.e 'orderby= show_noauth= show_title= link_titles= show_intro= show_section= link_section= show_category= link_category= show_author= show_create_date= show_modify_date= show_item_navigation= show_readmore= show_vote= show_icons= show_pdf_icon= show_print_icon= show_email_icon= show_hits= feed_summary= page_title= show_page_title=1 pageclass_sfx= menu_image=-1 secure=0 ', '0000-00-00 00:00:00', '13', '20', '1', '152', 'accmenu', 'IPL', 'ipl', 'index.php?option=com_content&view=archive', 'component' gets repeated , i.e. 
INSERT INTO 'jos_menu' ('params', 'checked_out_time', 'ordering', 'componentid', 'published', 'id', 'menutype', 'name', 'alias', 'link', 'type', 'params', 'checked_out_time', 'ordering', 'componentid', 'published', 'id', 'menutype', 'name', 'alias', 'link', 'type', 'params', 'checked_out_time', 'ordering', 'componentid', 'published', 'id', 'menutype', 'name', 'alias', 'link', 'type', 'params', 'checked_out_time', 'ordering', 'componentid', 'published', 'id', 'menutype', 'name', 'alias', 'link', 'type', 'params', 'checked_out_time', 'ordering', 'componentid', 'published', 'id', 'menutype', 'name', 'alias', 'link', 'type', 'params', 'checked_out_time', 'ordering', 'componentid', 'published', 'id', 'menutype', 'name', 'alias', 'link', 'type', 'params', 'checked_out_time', 'ordering', 'componentid', 'published', 'id', 'menutype', 'name', 'alias', 'link', 'type', 'params', 'checked_out_time', 'ordering', 'componentid', 'published', 'id', 'menutype', 'name', 'alias', 'link', 'type', 'params', 'checked_out_time', 'ordering', 'componentid', 'published', 'id', 'menutype', 'name', 'alias', 'link', 'type', 'params', 'checked_out_time', 'ordering', 'componentid', 'published', 'id', 'menutype', 'name', 'alias', 'link', 'type', 'params', 'checked_out_time', 'ordering', 'componentid', 'published', 'id', 'menutype', 'name', 'alias', 'link', 'type') values ('orderby= show_noauth= show_title= link_titles= show_intro= show_section= link_section= show_category= link_category= show_author= show_create_date= show_modify_date= show_item_navigation= show_readmore= show_vote= show_icons= show_pdf_icon= show_print_icon= show_email_icon= show_hits= feed_summary= page_title= show_page_title=1 pageclass_sfx= menu_image=-1 secure=0 ', '0000-00-00 00:00:00', '13', '20', '1', '152', 'accmenu', 'IPL', 'ipl', 'index.php?option=com_content&view=archive', 'component', 'orderby= show_noauth= show_title= link_titles= show_intro= show_section= link_section= show_category= link_category= show_author= show_create_date= show_modify_date= show_item_navigation= show_readmore= show_vote= show_icons= show_pdf_icon= show_print_icon= show_email_icon= show_hits= feed_summary= page_title= show_page_title=1 pageclass_sfx= menu_image=-1 secure=0 ', '0000-00-00 00:00:00', '13', '20', '1', '152', 'accmenu', 'IPL', 'ipl', 'index.php?option=com_content&view=archive', 'component', 'orderby= show_noauth= .. .. .. .. and so on What I want is I should get these Values for once, I know there is mistake using the outer Forloop, but I m not getting the idea of rectifying it.. Please help... please poke me for more clarification...

    Read the article

  • How can I make vim show the current class and method I'm editing

    - by dcrosta
    Does anyone know if it's possible (or know of an existing vim script or plugin) that can create a "status bar" that shows the name of the current class and method (or function) I'm editing? I'm imagining that it would plug into the syntax parser for the filetype of the current buffer, and display a breadcrumb trail to show you what you're currently editing. I don't know vimscript well enough to suggest any more than that, but if there aren't any good solutions already, I may begin to hack on one, so suggestions as to where to start are welcome, too!

    Read the article

  • OWB 11gR2 - Find and Search Metadata in Designer

    - by David Allan
    Here are some tools and techniques for finding objects, specifically in the design repository. There are ways of navigating and collating objects that are useful for day-to-day development and build-time usage - this includes features out of the box and utilities constructed on top. There are a variety of techniques to navigate and find objects in the repository; the first 3 are out of the box, the 4th is an expert utility. Navigating by the tree, grouping by project and module - OK if you are aware of the exact module/folder that objects reside in. The structure panel is a useful way of finding parts of an object, especially when the object is large, rather than using the canvas. In large scale projects it helps to have accelerators (either find or collections below). Advanced find to search by name - 11gR2 included a find capability specifically for large scale projects. There were improvements in both the tree search and the object editors (including highlighting in mapping for example). So you can now do regular expression based search and quickly navigate to objects within a repository. Collections - logically organize your objects into virtual folders by shortcutting the actual objects. This is useful for a range of things since all the OWB services operate on collections too (export/import, validation, deployment). See the post here for new collection functionality in 11gR2. Reports for searching by type, updated on, updated by etc. Useful for activities such as periodic incremental actions (deploy all mappings changed in the past week). The report style view is useful since I can quickly see who changed what and when. You can see all the audit details for objects within each object's property inspector, but it's useful to just get all objects changed today, for example, or all objects changed since my last build, etc. This utility combines both UI extensions via experts and the public views on the repository. In the figure to the right you see the contextual option 'Object Search' which invokes the utility; you can see I have quite a number of modules within my project. Figuring out all the potential objects which have been changed is not simple. The utility is an expert which provides this kind of search capability. The utility provides a report of the objects in the design repository which satisfy some filter criteria. The types of criteria include: objects updated in the last n days, optionally filtered by the user who updated them, by project and by type (tables/mappings etc.). The search dialog appears with these options; you can multi-select the object types, so for example you can select TABLE and MAPPING. It's also possible to search across projects if need be. If you have multiple users using the repository you can define the OWB user name in the 'Updated by' property to restrict the report to just that user also. Finally there is a search name that will be used for some of the options such as building a collection - this name is used for the collection to be built. In the example I have done, I've just searched my project for all process flows and mappings that users have updated in the last 7 days. The results of the query are returned in a table containing the object names, types, full path and audit details. The columns are sortable; you can sort the results by name, type, path etc.
    One of the cool things here is that you can then perform operations on these objects - such as edit them, export a single selection or the entire results to MDL, create a collection from the results (now you have a saved set of references in the repository, you could do deploy/export etc.), create a deployment script from the results...or even add in your own ideas! You see from this that you can do bulk operations on sets of objects based on search results. So, for example, selecting the 'Build Collection' option creates a collection with all of the objects from my search; you can subsequently deploy/generate/maintain this collection of objects. Under the hood, the expert is just basic OMB commands from the product and the use of the public views on the design repository. You can see how easy it is to build up macro-like capabilities that will help you do day-to-day as well as build-like tasks on sets of objects.
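
    As a rough sketch of the public-view half of that approach, the Python snippet below queries the design repository directly for recently changed objects of selected types. It assumes the cx_Oracle driver and an OWB design-repository public view with name, type, updated-by and updated-on columns; the view and column names used here (all_iv_objects, object_name, object_type_name, updated_by, updated_on), as well as the connection details, are illustrative and should be checked against the OWB public-view reference.

        import cx_Oracle

        # Connection details are placeholders for an OWB design-repository schema.
        conn = cx_Oracle.connect("owb_user", "owb_password", "dbhost:1521/orcl")

        # Hypothetical query mirroring the expert's filter: objects of given types
        # updated in the last N days (optionally also restrict to one OWB user).
        SQL = """
            SELECT object_name, object_type_name, updated_by, updated_on
              FROM all_iv_objects            -- illustrative public view name
             WHERE updated_on >= SYSDATE - :days
               AND object_type_name IN ('TABLE', 'MAPPING')
             ORDER BY updated_on DESC
        """

        cursor = conn.cursor()
        for name, obj_type, updated_by, updated_on in cursor.execute(SQL, days=7):
            print("%s  %-10s %s  (by %s)" % (updated_on, obj_type, name, updated_by))

        conn.close()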

    Read the article

  • Swap not available on System Monitor

    - by Zaki
    I had a swap partition of 1GB (RAM 1GB, Ubuntu 12.04 lts). Now swap is not shown on System Monitor neither can I hibernate my pc (sudo pm-hibernate). blkid output: /dev/sda1: UUID="B8B4FBB1B4FB706C" TYPE="ntfs" /dev/sda2: UUID="2ea7d608-2d89-4e41-9436-d05cb3ce8871" TYPE="swap" /dev/sda3: UUID="3219d03a-67e4-454b-8ce7-a27831846e35" TYPE="ext4" /dev/sda5: LABEL="Softwares" UUID="AC1CC3301CC2F47C" TYPE="ntfs" /dev/sda6: LABEL="Education" UUID="1E103E6C103E4B53" TYPE="ntfs" /dev/sda7: LABEL="Recreation" UUID="2CC8D181C8D149AA" TYPE="ntfs" /dev/sda8: LABEL="Miscellaneous" UUID="0274D6B174D6A727" TYPE="ntfs" /etc/fstab # <file system> <mount point> <type> <options> <dump> <pass> proc /proc proc nodev,noexec,nosuid 0 0 # / was on /dev/sda6 during installation UUID=3219d03a-67e4-454b-8ce7-a27831846e35 / ext4 errors=remount-ro 0 1 # swap was on /dev/sda5 during installation UUID=2ea7d608-2d89-4e41-9436-d05cb3ce8871 none swap sw 0 0 free -m total used free shared buffers cached Mem: 991 867 123 0 27 418 -/+ buffers/cache: 421 569 Swap: 0 0 0 cat /proc/swaps Filename Type Size Used Priority fdisk -l Disk /dev/sda: 160.0 GB, 160041885696 bytes 255 heads, 63 sectors/track, 19457 cylinders, total 312581808 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 512 bytes I/O size (minimum/optimal): 512 bytes / 512 bytes Disk identifier: 0x9f369f36 Device Boot Start End Blocks Id System /dev/sda1 * 63 31471334 15735636 7 HPFS/NTFS/exFAT /dev/sda2 31471616 33470447 999416 82 Linux swap / Solaris /dev/sda3 33472512 62539775 14533632 83 Linux /dev/sda4 62541045 312592769 125025862+ f W95 Ext'd (LBA) /dev/sda5 62541108 125066024 31262458+ 7 HPFS/NTFS/exFAT /dev/sda6 125066088 187591004 31262458+ 7 HPFS/NTFS/exFAT /dev/sda7 187591068 250115984 31262458+ 7 HPFS/NTFS/exFAT /dev/sda8 250116048 312576704 31230328+ 7 HPFS/NTFS/exFAT swapon --all swapon: /dev/sda2: swapon failed: Invalid argument dmesg | grep -A 5 -B 5 -i swap [ 9.487404] EXT4-fs (sda3): ext4_orphan_cleanup: deleting unreferenced inode 131645 [ 9.487413] EXT4-fs (sda3): ext4_orphan_cleanup: deleting unreferenced inode 131330 [ 9.487418] EXT4-fs (sda3): 16 orphan inodes deleted [ 9.487420] EXT4-fs (sda3): recovery complete [ 9.578600] EXT4-fs (sda3): mounted filesystem with ordered data mode. Opts: (null) [ 20.580539] Swap area shorter than signature indicates [ 20.588363] IPv6: ADDRCONF(NETDEV_UP): eth0: link is not ready [ 20.619443] udevd[330]: starting version 175 [ 20.649959] lp: driver loaded but no devices found [ 20.662972] [drm] Initialized drm 1.1.0 20060810 [ 20.675515] i915 0000:00:02.0: setting latency timer to 64 -- [ 72.288573] PM: thaw of drv:sr dev:3:0:0:0 complete after 178.143 msecs [ 72.288578] PM: thaw of drv:scsi_device dev:3:0:0:0 complete after 178.136 msecs [ 72.299677] PM: thaw of drv:scsi_device dev:2:0:0:0 complete after 189.270 msecs [ 72.309473] PM: thaw of devices complete after 202.763 msecs [ 72.309668] PM: writing image. [ 72.309670] PM: Cannot find swap device, try swapon -a. [ 72.309699] PM: Cannot get swap writer [ 72.329896] Restarting tasks ... done. 
[ 72.331777] PM: Basic memory bitmaps freed [ 72.331792] video LNXVIDEO:00: Restoring backlight state [ 72.420048] option1 ttyUSB0: option_instat_callback: error -84 [ 72.804047] option1 ttyUSB0: option_instat_callback: error -84 -- [ 145.960625] sd 7:0:0:0: Attached scsi generic sg2 type 0 [ 145.972036] sd 7:0:0:0: [sdb] Attached SCSI removable disk [ 172.430508] PPP BSD Compression module registered [ 172.455583] PPP Deflate Compression module registered [ 332.260789] type=1400 audit(1381814763.342:27): apparmor="DENIED" operation="capable" parent=1 profile="/usr/sbin/cupsd" pid=636 comm="cupsd" pid=636 comm="cupsd" capability=36 capname="block_suspend" [ 1913.030998] Swap area shorter than signature indicates [ 2022.530155] type=1400 audit(1381816453.610:28): apparmor="DENIED" operation="capable" parent=1 profile="/usr/sbin/cupsd" pid=636 comm="cupsd" pid=636 comm="cupsd" capability=36 capname="block_suspend" [ 4062.729509] Swap area shorter than signature indicates Please help. Thanks in advance. df -h Filesystem Size Used Avail Use% Mounted on /dev/sda3 14G 6.1G 7.0G 47% / udev 488M 4.0K 488M 1% /dev tmpfs 199M 868K 198M 1% /run none 5.0M 4.0K 5.0M 1% /run/lock none 496M 224K 496M 1% /run/shm

    Read the article

  • Protecting Consolidated Data on Engineered Systems

    - by Steve Enevold
    In this time of reduced budgets and cost cutting measures in Federal, State and Local governments, the requirement to provide services continues to grow. Many agencies are looking at consolidating their infrastructure to reduce cost and meet budget goals. Oracle's engineered systems are ideal platforms for accomplishing these goals. These systems provide unparalleled performance that is ideal for running applications and databases that traditionally run on separate dedicated environments. However, putting multiple critical applications and databases in a single architecture makes security more critical. You are putting a concentrated set of sensitive data on a single system, making it a more tempting target. The environments were previously separated by iron, so now you need to provide assurance that one group, department, or application's information is not visible to other personnel or applications resident in the Exadata system. Administration of the environments requires formal separation of duties so an administrator of one application environment cannot view or negatively impact others. Also, these systems need to be in protected environments just like other critical production servers. They should be in a data center protected by physical controls, network firewalls, intrusion detection and prevention, etc. Exadata also provides unique security benefits, including a reduced attack surface achieved by minimizing packages and services to only those required. In addition to reducing the possible system areas someone may attempt to infiltrate, Exadata has the following features: 1. InfiniBand, which functions as a secure private backplane; 2. IPTables to perform stateful packet inspection for all nodes (Cellwall implements firewall services on each cell using IPTables); 3. Hardware accelerated encryption for data at rest on storage cells. Oracle is uniquely positioned to provide the security necessary for implementing Exadata because security has been a core focus since the company's beginning. In addition to the security capabilities inherent in Exadata, Oracle security products are all certified to run in an Exadata environment. Database Vault: Oracle Database Vault helps organizations increase the security of existing applications and address regulatory mandates that call for separation-of-duties, least privilege and other preventive controls to ensure data integrity and data privacy. Oracle Database Vault proactively protects application data stored in the Oracle database from being accessed by privileged database users. A unique feature of Database Vault is the ability to segregate administrative tasks, including when a command can be executed, or that the DBA can manage the health of the database and objects but may not see the data. Advanced Security helps organizations comply with privacy and regulatory mandates by transparently encrypting all application data or specific sensitive columns, such as credit cards, social security numbers, or personally identifiable information (PII). By encrypting data at rest and whenever it leaves the database over the network or via backups, Oracle Advanced Security provides the most cost-effective solution for comprehensive data protection. Label Security is a powerful and easy-to-use tool for classifying data and mediating access to data based on its classification.
    Designed to meet public-sector requirements for multi-level security and mandatory access control, Oracle Label Security provides a flexible framework that both government and commercial entities worldwide can use to manage access to data on a "need to know" basis in order to protect data privacy and achieve regulatory compliance. Data Masking reduces the threat of someone in the development organization taking data that has been copied from production to the development environment for testing, upgrades, etc., by irreversibly replacing the original sensitive data with fictitious data so that production data can be shared safely with IT developers or offshore business partners. Audit Vault and Database Firewall: Oracle Audit Vault and Database Firewall serves as a critical detective and preventive control across multiple operating systems and database platforms to protect against the abuse of legitimate access to databases, which is responsible for almost all data breaches and cyber attacks. Consolidation, cost-savings, and performance can now be achieved without sacrificing security. The combination of built-in protection and Oracle's industry-leading data protection solutions makes Exadata an ideal platform for Federal, State, and local governments and agencies.

    Read the article

  • Managing Operational Risk of Financial Services Processes – part 2/2

    - by Sanjeev Sharma
    In my earlier blog post, I had described the factors that lead to compliance complexity of financial services processes. In this post, I will outline the business implications of the increasing process compliance complexity and the specific role of BPM in addressing the operational risk reduction objectives of regulatory compliance. First, let’s look at the business implications of increasing complexity of process compliance for financial institutions: · Increased time and cost of compliance due to duplication of effort in conforming to regulatory requirements as process changes are driven by evolving regulatory mandates, shifting business priorities or internal/external audit requirements · Delays in audit reporting due to quality issues in reconciling non-standard process KPIs and integrity concerns arising from the need to rely on multiple data sources for a given process Next, let’s consider some approaches to managing the operational risk of business processes. Financial institutions considering reducing operational risk of their processes, generally speaking, have two choices: · Rip-and-replace existing applications with new off-the-shelf applications. · Extend capabilities of existing applications by modeling their data and process interactions, with other applications or user-channels, outside of the application boundary using BPM. The benefit of the first approach is that compliance with new regulatory requirements would be embedded within the boundaries of these applications. However, pre-built compliance of any packaged application or custom-built application should not be mistaken as a one-shot fix for future compliance needs. The reason is that business needs and regulatory requirements inevitably outgrow the end-to-end capabilities of even the most comprehensive packaged or custom-built business application. Thus, processes that originally resided within the application will eventually spill outside the application boundary. It is precisely at such hand-offs between applications or between overlaying processes where vulnerabilities to unknown and accidental faults arise, potentially resulting in errors and leading to partial or total failure. The gist of the above argument is that processes which reside outside application boundaries (in other words, span multiple applications) constitute a latent operational risk that spans the end-to-end value chain. For instance, distortion of data flowing from an account-opening application to a credit-rating system, if left unchecked, renders compliance with “KYC” policies void even when the “KYC” checklist was enforced at the time of data capture by the account-opening application. Oracle Business Process Management is enabling financial institutions to lower the operational risk of such process “gaps” for Financial Services processes including “Customer On-boarding”, “Quote-to-Contract”, “Deposit/Loan Origination”, “Trade Exceptions”, “Interest Claim Tracking” etc.
If you are faced with a similar challenge and need any guidance on the same feel free to drop me a note.

    Read the article

  • What is the scope of CONTEXT_INFO in SQL Server?

    - by JasonS
    I am using CONTEXT_INFO to pass a username to a delete trigger for the purposes of an audit/history table. I'm trying to understand the scope of CONTEXT_INFO and whether I am creating a potential race condition. Each of my database tables has a stored proc to handle deletes. The delete stored proc takes userId as a parameter, and sets CONTEXT_INFO to the userId. My delete trigger then grabs the CONTEXT_INFO and uses that to update an audit table that indicates who deleted the row(s). The question is, if two delete sprocs from different users are executing at the same time, can CONTEXT_INFO set in one of the sprocs be consumed by the trigger fired by the other sproc? I've seen this article http://msdn.microsoft.com/en-us/library/ms189252.aspx but I'm not clear on the scope of sessions and batches in SQL Server, which is key to the article being helpful! I'd post code, but I'm short on time at the moment. I'll edit later if this isn't clear enough. Thanks in advance for any help.
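
    As background for the question: CONTEXT_INFO is scoped to the session (the connection), and a trigger runs in the same session as the statement that fired it, so a value set on one connection is not visible to a trigger firing on another connection; the case to watch is connection pooling reusing a session that still carries a stale value. The hedged Python/pyodbc sketch below only illustrates the per-session scoping; the connection string and user values are placeholders.

        import pyodbc

        # Placeholder connection string; each pyodbc connection below is its own
        # SQL Server session, and CONTEXT_INFO is scoped to that session.
        CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
                    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes")

        def set_user_context(conn, user_id):
            # SET CONTEXT_INFO only accepts a variable or constant, so stage the
            # value in a varbinary variable inside the batch.
            conn.execute(
                "DECLARE @ctx varbinary(128) = CAST(? AS varbinary(128)); "
                "SET CONTEXT_INFO @ctx",
                str(user_id),
            )

        def read_context(conn):
            # Reads back whatever this session set earlier (zero-padded to 128 bytes).
            return conn.execute(
                "SELECT CAST(CONTEXT_INFO() AS varchar(128))").fetchone()[0]

        conn_a = pyodbc.connect(CONN_STR, autocommit=True)
        conn_b = pyodbc.connect(CONN_STR, autocommit=True)

        set_user_context(conn_a, "alice")
        set_user_context(conn_b, "bob")

        # Each session sees only its own value; a delete trigger fired on conn_a
        # runs inside conn_a's session and therefore reads 'alice', never 'bob'.
        print(read_context(conn_a))
        print(read_context(conn_b))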

    Read the article

  • Certificate issues running app in Windows 7?

    - by Jurjen
    Hi, I'm having some problems with my app. I'm using the 'org.mentalis.security' assembly to create a certificate object from a 'pfx' file. This is the line of code where the exception occurs: Certificate cert = Certificate.CreateFromPfxFile(publicKey, certificatePassword); This has always worked and still does in production, but for some reason it throws an exception when run on Windows 7 (tried it on 2 machines). CertificateException : Unable to import the PFX file! [error code = -2146893792] I can't find much on this message via Google, but when checking the Event Viewer I get an 'Audit Failure' every time this exception occurs: Event ID = 5061 Source = Microsoft Windows Security Task Category = system Integrity Keywords = Audit Failure Cryptographic operation. Subject: Security ID: NT AUTHORITY\IUSR Account Name: IUSR Account Domain: NT AUTHORITY Logon ID: 0x3e3 Cryptographic Parameters: Provider Name: Microsoft Software Key Storage Provider Algorithm Name: Not Available. Key Name: VriendelijkeNaam Key Type: User key. Cryptographic Operation: Operation: Open Key. Return Code: 0x2 I'm not sure why this isn't working on Windows 7; I've never had problems when I was running on Vista with this. I am running VS2008 as administrator, but I guess that maybe the ASP.NET user doesn't have sufficient rights or something. It's pretty strange that the 'Algorithm Name' is 'Not Available'. Can anyone help me with this? TIA, Jurjen de Groot

    Read the article

  • Application connection with database persists even after successful transaction

    - by anupam3m
    Hi , I am using Spring.Data.NHibernate12 on my database level.my application connection with database is not getting released. Underneath given is Dataconfiguration.xml < ?xml version="1.0" encoding="utf-8" ? < objects xmlns="http://www.springframework.net" xmlns:db="http://www.springframework.net/database" < object id="AuditLogger" type="Risco.Rsp.Ac.Audit.AuditLogger, Risco.Rsp.Ac.Audit" singleton="false" < property name="CacheSettings" ref="CacheSettings"/ < /object < object id="CacheSettings" type="Risco.Rsp.Ac.AMAC.CacheMgmt.Utilities.UpdateEntityCacheHelper, Risco.Rsp.Ac.AMAC.CacheMgmt.Utilities" singleton="false"/ < object type="Spring.Objects.Factory.Config.PropertyPlaceholderConfigurer, Spring.Core" < <property name="ConfigSections" value="databaseSettings"/> < < db:provider id="AMACDbProvider" provider="OracleClient-2.0" connectionString="Data Source=RISCODEVDB;User ID=amsbvt; Password=amsuser1234;"/ Risco.Rsp.Ac.AMAC.Mapping Risco.Rsp.Ac.Logging.Appenders Risco.Rsp.Ac.AMAC.CacheMappings --

    Read the article

  • Messages not forwarded to error queue when exception is thrown in handler (it works on my machine)

    - by darthjit
    We are using NServiceBus 4.0.5 with SQL Server (SQL Server 2012) as transport. When the handler throws an exception, NSB does not retry or move the message to the error queue. Successful messages make it to the audit queue but the failed/errored ones don't! Interestingly, all this works on our local machines (Windows 7, SQL Server LocalDB) but not on Windows Server 2012 (SQL Server 2012). Here is the config info on the subscriber: <add name="NServiceBus/Transport" connectionString="Data Source=xxx;Initial Catalog=NServiceBus;Integrated Security=SSPI;Enlist=false;" /> <add name="NServiceBus/Persistence" connectionString="Data Source=xxx;Initial Catalog=NServiceBus;Integrated Security=SSPI;Enlist=false;" /> <MessageForwardingInCaseOfFaultConfig ErrorQueue="error" /> <UnicastBusConfig ForwardReceivedMessagesTo="audit"> <MessageEndpointMappings> <add Assembly="Services.Section.Messages" Endpoint="Services.ACL.Worker" /> </MessageEndpointMappings> </UnicastBusConfig> And in code it is configured as follows: public class EndpointConfig : IConfigureThisEndpoint, AsA_Server, IWantCustomInitialization { public void Init() { IContainer container = ContainerInstanceProvider.GetContainerInstance(); Configure.Transactions.Enable(); Configure.With() .AutofacBuilder(container) .UseTransport<SqlServer>() .Log4Net() //.Serialization.Json() .UseNHibernateSubscriptionPersister() .UseNHibernateTimeoutPersister() .MessageForwardingInCaseOfFault() .RijndaelEncryptionService() .DefiningCommandsAs(type => type.Namespace != null && type.Namespace.EndsWith("Commands")) .DefiningEventsAs(type => type.Namespace != null && type.Namespace.EndsWith("Events")) .UnicastBus(); } } Any ideas on how to fix this? Here is the log info (there is a lot there, search for error to see the relevant parts): https://gist.github.com/ranji/7378249

    Read the article

  • How do I execute queries upon DB connection in Rails?

    - by sycobuny
    I have certain initializing functions that I use to set up audit logging on the DB server side (i.e., not Rails) in PostgreSQL. At least one has to be issued (setting the current user) before inserting data into or updating any of the audited tables, or else the whole query will fail spectacularly. I can easily call these before every save operation in the code, but DRY makes me think I should have the code repeated in as few places as possible, particularly since this diverges greatly from the ideal of database agnosticism. Currently I'm attempting to override ActiveRecord::Base.establish_connection in an initializer so that the queries are run automatically as soon as I connect, but it doesn't behave as I expect. Here is the code in the initializer:

        class ActiveRecord::Base
          # extend the class methods, not the instance methods
          class << self
            alias :old_establish_connection :establish_connection # hide the default

            def establish_connection(*args)
              ret = old_establish_connection(*args) # call the default

              # set up necessary session variables for audit logging;
              # call these after calling the default, to make sure the conn is established first
              db = self.class.connection
              db.execute("SELECT SV.set('current_user', 'test@localhost')")
              db.execute("SELECT SV.set('audit_notes', NULL)") # end "empty variable" err

              ret # return the default's original value
            end
          end
        end
        puts "Loaded custom establish_connection into ActiveRecord::Base"

        sycobuny:~/rails$ ruby script/server
        => Booting WEBrick
        => Rails 2.3.5 application starting on http://0.0.0.0:3000
        Loaded custom establish_connection into ActiveRecord::Base

    This doesn't give me any errors, and unfortunately I can't check what the method looks like internally (I was using ActiveRecord::Base.method(:establish_connection), but apparently that creates a new Method object each time it's called, which is seemingly worthless because I can't check object_id for any worthwhile information, and I also can't reverse the compilation). However, the code never seems to get called, because any attempt to run a save or an update on a database object fails as I predicted earlier. If this isn't a proper way to execute code immediately on connection to the database, then what is?

    Read the article

  • Oracle: Addressing Information Overload in Factory Automation

    - by [email protected]
     ORACLE's Stephen Slade has written about addressing information overload on the factory floor. According to Slade, today's automated processes create large amounts of valuable data, but only a small percentage remains actionable. Oracle claims information overload carries a financial cost, as companies struggle to store and collect the reams of data needed to identify embedded trends, while producing manual reports to meet quality standards, regulatory requirements and general reporting goals. Increasing scrutiny of new requirements and standards adds to the need to find new ways to process data. Many companies are now using analytical engines to contextualise data into 'actionable information'. Oracle claims factories need to seriously address their data collection, audit trail and records retention processes. By organising their data, factories can maximise outcomes from excellence and continuous improvement programs, and gain visibility into costs in the supply chain. Analytics tools and technologies such as Business Intelligence (BI), Enterprise Manufacturing Intelligence (EMI) and Manufacturing Operations Centers (MOC) can help consolidate, contextualise and distribute information. FULL ARTICLE: http://www.myfen.com.au/news/oracle--addressing-information-overload-in-factory

    Read the article

  • Wireless driver - how to load manufacturer's STA file (Ralink 3290)

    - by Matt
    Caution: I'm a newb.
    Hardware: Giada i35G, Cedar Trail Atom with NVIDIA GF119, Realtek Ethernet, and a Ralink 3290 for wireless.
    Already accomplished: installed Ubuntu 12.10, loaded the GPU drivers and redirected sound through the GPU card to HDMI. Ethernet works like a charm.
    Issue: I can't get my wireless up and running. There seems to be no package I can simply install with sudo apt-get install ... I found the corresponding Linux driver on the manufacturer's site, but I have not managed to find out what to do with the file. Here's the manufacturer's site: http://www.ralinktech.com/en/04_support/support.php?sn=501 The file I get is named 2012_0508_RT3290_Linux_STA_v2.6.0.0.bz2 I hope somebody can tell me what to do next. Thanks for reading, and apologies for potentially asking a trivial question. Best regards, Matt

    Read the article

  • Mod Puts Mac OS 7 On the Nook Touch

    - by Jason Fitzpatrick
    Thanks to a mac-hardware emulator for Android, it’s now possible to run Mac OS 7 on the Nook Touch (or other Android-based tablet). If you’ve been looking for some retro-goodness to dump on your Nook or tablet–Oregon Trail anyone?–this simple hack will certainly help. Hit up the link below for additional screenshots and more information. Mini vMac for Android Development Thread [via MikeCanex]

    Read the article

  • Linux installation on Acer Aspire One D270

    - by ronnie
    I was planning to buy an Acer Aspire One D270 within a few days, and as everybody installs Linux on their netbook, I was planning to do that too. My question is: how is Acer's hardware compatibility with Linux, specifically with respect to the new Acer Aspire One D270? Has anybody tried installing Linux on these new netbooks? It would be a great help if a D270 user could share his/her experience with Linux. I read on some forums that there is a Linux driver issue with the Intel GMA 3600 and that people are not able to adjust their brightness. As I am a Linux noob, is this a major issue or not? Specs: RAM: 2 GB DDR3; Processor: Intel N2600 (Cedar Trail); Graphics: Intel GMA 3600; Hard disk: 320 GB, 5400 rpm

    Read the article

  • Leveraging AutoVue in Oracle's Universal Content Management for Improved Document

    AutoVue visualization, leveraged within Oracle’s Universal Content Management (UCM), makes technical information widely available to UCM users, allowing them to review and collaborate on CAD and engineering content in a variety of business processes and workflows. Comments and feedback are captured within the design context and recorded and tracked digitally within UCM, providing a reliable trail of decisions and approvals and thereby facilitating an organization’s audit compliance. The joint solution can also be leveraged in broader Oracle applications such as WebCenter and eAM, to name a few. Hear about the benefits UCM users can achieve by introducing AutoVue visualization into their UCM environment.

    Read the article

  • How do I get an Acer Aspire One D270 working?

    - by ronnie
    I was planning to buy an Acer Aspire One D270 within a few days, and as everybody installs Linux on their netbook, I was planning to do that too. My question is: how is Acer's hardware compatibility with Linux, specifically with respect to the new Acer Aspire One D270? Has anybody tried installing Linux on these new netbooks? It would be a great help if a D270 user could share his/her experience with Linux. I read on some forums that there is a Linux driver issue with the Intel GMA 3600 and that people are not able to adjust their brightness. As I am a Linux noob, is this a major issue or not? Specs: RAM: 2 GB DDR3; Processor: Intel N2600 (Cedar Trail); Graphics: Intel GMA 3600; Hard disk: 320 GB, 5400 rpm

    Read the article

  • Mouse scroll issue after kernel build

    - by Anish S Kumar
    I have an Intel Cedar Trail netbook. For graphics to work, I had to build kernel 3.1 with the drivers; I followed the steps in this document. After doing that, my graphics are now fine, but my mouse scroll does not work. Is that because I have not built the kernel properly? Have I missed selecting some options in the kernel compile menu? It would be nice if someone could help me. Also, my Wacom Bamboo tablet is not recognized, even though I have installed the xserver-xorg-input-wacom drivers.

    Read the article

  • Methodology for Documenting Existing Code Base

    - by George Stocker
    I work as part of a team on an existing application that has no inline documentation, nor does it have technical documentation. As I've been working on various bug reports, I've left a sort of breadcrumb trail for myself - bug numbers in various places - so that the next developer can refer to that bug number to see what was going on. My question is thus: what is the most efficient method for documenting this code? Should I document each area as I touch it (the virus method, if you will), or should I document each section on its own, without following paths that branch out into other areas of the application? Should I insert inline comments where none previously existed (with the risk that I may end up incorrectly identifying what the code does)? What method would you use to accurately and quickly document a rather large application that has no existing inline documentation, nor inline references to external documentation?
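    As a purely hypothetical illustration (none of the names below come from the question), the "breadcrumb trail" approach can be as small as a one-line comment pointing at the bug number instead of repeating the whole history inline:

        using System;

        public static class BillingRules
        {
            public static decimal ApplyLateFee(decimal balance)
            {
                // BUG-4512: fee capped at 5% after the rounding change; see the
                // tracker entry for the discussion and the regression tests.
                return Math.Round(balance * 0.05m, 2);
            }
        }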

    Read the article

  • JavaFX 2.2.3 Documentation

    - by joni g.
    JavaFX 2.2.3 and JDK 7u9 were released today. In addition to the release documentation, the following new information is provided: Learn about some of the "behind the scenes" work for an application, such as threads, events, and binding with the new learning trail on the landing page. Learn how to use cell editors with the List View component. The new example in the UI Controls tutorial shows how to build a list of names by selecting them from a combo box. Other documents were updated to reflect minor bug fixes. You can download JavaFX 2.2.3 from OTN. For all tutorials and API documentation, see http://docs.oracle.com/javafx. Other News: JavaFX Scene Builder 1.1 Developer Preview was released during the week of JavaOne and is available from OTN. This version contains support for the Linux and Mac OS X 10.8 platforms, and a preview of the new CSS Analyzer feature. See the release notes for more information.

    Read the article

  • You couldn't write it - Expired SA account

    - by GrumpyOldDBA
    This is the stuff of DBA nightmares! Email trail:
    Q. Can you reset the SA account on server XXXXX? We think it has expired and now no-one can work.
    Connect to the server: surely no-one would set up a server with an sa account which expires? Thankfully not. Find the sa password and change the connection to use the sa account. Connect without issue.
    Me: Have checked the server and the account is fine.
    A. Thanks, that's great, you've fixed it, we can all work now....(read more)

    Read the article

  • Ubuntu 12.10 live-usb won't boot

    - by user109175
    I own an Aleutia "Tango" low-power PC. It uses an Intel Atom N2800 CPU @ 1.86 GHz. I know there are Linux issues with the "Cedar Trail" graphics on this hardware, but I can happily run Ubuntu 12.04 using the VESA graphics driver. I wanted to try out Ubuntu 12.10, so I created a live USB under 12.04, but I am unable to get it to boot. It gets as far as the GRUB screen, but after that the monitor shuts down. I have tried the F6 "nomodeset" option, but that doesn't make any difference. Does anyone have any knowledge of this problem? Thanks in advance.

    Read the article

  • Best book for learning C++? [closed]

    - by gablin
    Possible Duplicate: Is there a good book to grok C++? I'm lacking a good book for learning C++. Although I'm not a complete novice, I've learnt C++ mostly by trial and error and Googling, so I don't know all the best practices concerning C++. What book or books should I get and read? Please put only one book per answer, for voting purposes, and maybe a short description of who it's targeted at. Hey, what happened to the community wiki option...?

    Read the article

< Previous Page | 12 13 14 15 16 17 18 19 20 21 22 23  | Next Page >