Search Results

Search found 27967 results on 1119 pages for 'oracle overview applications'.


  • Ibator didn't generate Oracle varchar2 field

    - by bugbug
    I have a table APP_REQ_APPROVE_COMPARE with the following fields:

        "ID" NUMBER NOT NULL ENABLE,
        "TRACK_NO" VARCHAR2(20 BYTE) NOT NULL ENABLE,
        "REQ_DATE" DATE NOT NULL ENABLE,
        "OFFCODE" CHAR(6 BYTE) NOT NULL ENABLE,
        "COMPARE_CASE_ID" NUMBER NOT NULL ENABLE,
        "VEHICLE_NAME" VARCHAR2(100 BYTE),
        "ENGINE_NO" VARCHAR2(100 BYTE),
        "BODY_NO" VARCHAR2(100 BYTE),
        "HOLD_SHIP" NUMBER,
        "OWNERSHIP" VARCHAR2(200 BYTE),
        "RENT_NAME" VARCHAR2(200 BYTE),
        "CONTRACT" VARCHAR2(100 BYTE),
        "CONTRACT_NO" VARCHAR2(100 BYTE),
        "CONTRACT_DATE" DATE,
        "ISLAWBREAKERRENT" CHAR(1 BYTE) NOT NULL ENABLE,
        "MISTAKE_DETAIL" VARCHAR2(4000 BYTE),
        "COMPARE_REASON" VARCHAR2(4000 BYTE),
        "CREATE_BY" NUMBER NOT NULL ENABLE,
        "CREATE_ON" DATE DEFAULT SYSDATE NOT NULL ENABLE,
        "UPDATE_BY" NUMBER,
        "UPDATE_ON" DATE

    When I generate a Java bean using Ibator, I don't get trackNo, vehicleName, ... (that is, none of the fields defined as VARCHAR2). What is the problem in my case? Here is my Ibator configuration file:

        <?xml version="1.0" encoding="UTF-8"?>
        <!DOCTYPE ibatorConfiguration PUBLIC "-//Apache Software Foundation//DTD Apache iBATIS Ibator Configuration 1.0//EN" "http://ibatis.apache.org/dtd/ibator-config_1_0.dtd">
        <ibatorConfiguration>
          <classPathEntry location="/dos/connector/oracle_jdbc.jar"/>
          <ibatorContext id="autoPerson" defaultModelType="flat" targetRuntime="Ibatis2Java2">
            <jdbcConnection connectionURL="jdbc:oracle:thin:@192.168.42.144:1521:orcl" driverClass="oracle.jdbc.driver.OracleDriver" userId="user" password="password"/>
            <javaModelGenerator targetPackage="com.ko.model" targetProject="FormConfig">
              <property name="enableSubPackages" value="true"/>
              <property name="trimStrings" value="true"/>
            </javaModelGenerator>
            <sqlMapGenerator targetPackage="com.ko.map" targetProject="FormConfig">
              <property name="enableSubPackages" value="true"/>
            </sqlMapGenerator>
            <daoGenerator targetPackage="com.ko.model.dao" type="SPRING" targetProject="FormConfig" implementationPackage="com.ko.model.dao.impl">
              <property name="enableSubPackges" value="true"/>
              <property name="methodNameCalculator" value="extended"/>
            </daoGenerator>
            <table tableName="APP_REQ_APPROVE_COMPARE" domainObjectName="AppReqApproveCompare"/>
          </ibatorContext>
        </ibatorConfiguration>

    Read the article

  • Add new row in a databound form with a Oracle Sequence as the primary key

    - by Ranhiru
    I am connecting to Oracle 11g from C#. I have a DataTable which I fill using an OracleDataAdapter:

        OracleDataAdapter da;
        DataTable dt = new DataTable();
        da = new OracleDataAdapter("SELECT * FROM Author", con);
        da.Fill(dt);

    I have a few text boxes that I have databound to various columns of the data table:

        txtAuthorID.DataBindings.Add("Text", dt, "AUTHORID");
        txtFirstName.DataBindings.Add("Text", dt, "FIRSTNAME");
        txtLastName.DataBindings.Add("Text", dt, "LASTNAME");
        txtAddress.DataBindings.Add("Text", dt, "ADDRESS");
        txtTelephone.DataBindings.Add("Text", dt, "TELEPHONE");
        txtEmailAddress.DataBindings.Add("Text", dt, "EMAIL");

    I also have a DataGridView below the text boxes, showing the contents of the DataTable:

        dgvAuthor.DataSource = dt;

    Now when I want to add a new row, I do bm.AddNew(); where bm is defined in Form_Load as:

        BindingManagerBase bm;
        bm = this.BindingContext[dt];

    And when the save button is clicked, after all the information is entered and validated, I do:

        this.BindingContext[dt].EndCurrentEdit();
        try { da.Update(dt); }
        catch (Exception ex) { MessageBox.Show(ex.Message); }

    However, when I normally insert a row into the database (using SQL*Plus), I use my_pk_sequence.nextval for the primary key. How do I specify that when I add a new row with this approach? At the moment I get the exception ORA-01400: cannot insert NULL into ("SYSMAN"."AUTHOR"."AUTHORID"), which is obvious because nothing was supplied for the primary key. How do I get around this? Thanks a lot in advance :)

    Read the article

  • Oracle doesn't remove cursors after closing result set

    - by Vladimir
    Note: we reuse a single connection.

        public Connection connection() {
            try {
                if ((connection == null) || (connection.isClosed())) {
                    if (connection != null) log.severe("Connection was closed !");
                    connection = DriverManager.getConnection(jdbcURL, username, password);
                }
            } catch (SQLException e) {
                log.severe("can't connect: " + e.getMessage());
            }
            return connection;
        }

        public IngisObject[] select(String query, String idColumnName, String[] columns) {
            Connection con = connection();
            Vector<IngisObject> objects = new Vector<IngisObject>();
            try {
                Statement stmt = con.createStatement();
                String sql = query;
                ResultSet rs = stmt.executeQuery(sql); // Oracle increases the cursor count here
                while (rs.next()) {
                    IngisObject o = new IngisObject("New Result");
                    o.setIdColumnName(idColumnName);
                    o.setDatabase(this);
                    for (String column : columns) o.attrs().put(column, rs.getObject(column));
                    objects.add(o);
                }
                rs.close();   // Oracle doesn't decrease the cursor count here, although that is expected
                stmt.close();
            } catch (SQLException ex) {
                System.out.println(query);
                ex.printStackTrace();
            }
            return objects.toArray(new IngisObject[objects.size()]);
        }
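    One thing worth checking, sketched below: if rs.next() or rs.getObject() throws, the rs.close() and stmt.close() calls above are skipped entirely, and that statement's cursor stays open until the connection itself is closed. Moving the cleanup into a finally block guarantees the cursor is released. This is a minimal illustration of the pattern, not the original code; the names are placeholders:

```java
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class CursorSafeQuery {

    // Runs a query and always closes the ResultSet and Statement, so the
    // server-side cursor is released even if processing a row throws.
    public static void queryAndClose(Connection con, String sql) throws SQLException {
        Statement stmt = null;
        ResultSet rs = null;
        try {
            stmt = con.createStatement();
            rs = stmt.executeQuery(sql);
            while (rs.next()) {
                // process the row here
            }
        } finally {
            if (rs != null) {
                try { rs.close(); } catch (SQLException ignored) { }
            }
            if (stmt != null) {
                try { stmt.close(); } catch (SQLException ignored) { }
            }
        }
    }
}
```

    Closing the Statement is what actually frees the cursor on the server; also note that some cursor-count views may include session-cached cursors, so the reported number does not necessarily drop immediately even after a clean close.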

    Read the article

  • How to export Oracle statistics

    - by A_M
    Hi, I am writing some new SQL queries and want to check the query plans that the Oracle query optimiser would come up with in production. My development database doesn't have anything like the data volumes of the production database. How can I export database statistics from a production database and re-import them into a development database? I don't have access to the production database, so I can't simply generate explain plans on production without going through a third party hosting organisation. This is painful. So I want a local database which is in some way representative of production on which I can try out different things. Also, this is for a legacy application. I'd like to "improve" the schema, by adding appropriate indexes, constraints, etc. I need to do this in my development database first, before rolling out to test and production. If I add an index and re-generate statistics in development, then the statistics will be generated around the development data volumes, which makes it difficult to assess the impact of my changes on production. Does anyone have any tips on how to deal with this? Or is it just a case of fixing unexpected behaviour once we've discovered it on production? I do have a staging database with production volumes, but again I have to go through a third party to run queries against this, which is painful. So I'm looking for ways to cut out the middle man as much as possible. All this is using Oracle 9i. Thanks.
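    For reference, the usual mechanism for this on Oracle 9i is the DBMS_STATS export/import routines: create a statistics table, export the schema statistics into it on production, move that table across (for example with exp/imp), and import it on development. The sketch below only illustrates the call sequence, driven over JDBC; the connection details, schema name and statistics table name are placeholders, and whoever runs the production side would typically just issue the same calls from SQL*Plus.

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class ExportSchemaStats {

    public static void main(String[] args) throws SQLException {
        // Placeholder connection details -- adjust for your environment.
        Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@prodhost:1521:PROD", "app_owner", "password");
        try {
            // 1. Create a holding table for the statistics.
            call(con, "{call dbms_stats.create_stat_table(?, ?)}", "APP_OWNER", "STATS_EXPORT");
            // 2. Copy the optimizer statistics for the schema into that table.
            call(con, "{call dbms_stats.export_schema_stats(?, ?)}", "APP_OWNER", "STATS_EXPORT");
            // 3. Move STATS_EXPORT to the development database (exp/imp, database link, etc.),
            //    then on the development connection run:
            //    call(devCon, "{call dbms_stats.import_schema_stats(?, ?)}", "APP_OWNER", "STATS_EXPORT");
        } finally {
            con.close();
        }
    }

    private static void call(Connection con, String sql, String owner, String statTable)
            throws SQLException {
        CallableStatement cs = con.prepareCall(sql);
        try {
            cs.setString(1, owner);
            cs.setString(2, statTable);
            cs.execute();
        } finally {
            cs.close();
        }
    }
}
```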

    Read the article

  • DBD::Oracle and utf8 issue

    - by goe
    Hi All, I have a problem where my perl code using the latest DBD::Oracle on perl v5.8.8 throws an exception on me when I try to insert characters like 'ñ'. Exception: DBD::Oracle::db do failed: ORA-01756: quoted string not properly terminated (DBD ERROR: OCIStmtPrepare) My $ENV{NLS_LANG} is set to 'AMERICAN_AMERICA.AL32UTF8' These are the DB params based on "SELECT * from NLS_DATABASE_PARAMETERS" 1 NLS_LANGUAGE AMERICAN 2 NLS_TERRITORY AMERICA 3 NLS_CURRENCY $ 4 NLS_ISO_CURRENCY AMERICA 5 NLS_NUMERIC_CHARACTERS ., 6 NLS_CHARACTERSET AL32UTF8 7 NLS_CALENDAR GREGORIAN 8 NLS_DATE_FORMAT DD-MON-RR 9 NLS_DATE_LANGUAGE AMERICAN 10 NLS_SORT BINARY 11 NLS_TIME_FORMAT HH.MI.SSXFF AM 12 NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM 13 NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZR 14 NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZR 15 NLS_DUAL_CURRENCY $ 16 NLS_COMP BINARY 17 NLS_LENGTH_SEMANTICS BYTE These are perl params based on "$db-ora_nls_parameters()" $VAR1 = { 'NLS_LANGUAGE' => 'AMERICAN', 'NLS_TIME_TZ_FORMAT' => 'HH.MI.SSXFF AM TZR', 'NLS_SORT' => 'BINARY', 'NLS_NUMERIC_CHARACTERS' => '.,', 'NLS_TIME_FORMAT' => 'HH.MI.SSXFF AM', 'NLS_ISO_CURRENCY' => 'AMERICA', 'NLS_COMP' => 'BINARY', 'NLS_CALENDAR' => 'GREGORIAN', 'NLS_DATE_FORMAT' => 'DD-MON-RR', 'NLS_DATE_LANGUAGE' => 'AMERICAN', 'NLS_TIMESTAMP_FORMAT' => 'DD-MON-RR HH.MI.SSXFF AM', 'NLS_TERRITORY' => 'AMERICA', 'NLS_LENGTH_SEMANTICS' => 'BYTE', 'NLS_NCHAR_CHARACTERSET' => 'AL16UTF16', 'NLS_DUAL_CURRENCY' => '$', 'NLS_TIMESTAMP_TZ_FORMAT' => 'DD-MON-RR HH.MI.SSXFF AM TZR', 'NLS_NCHAR_CONV_EXCP' => 'FALSE', 'NLS_CHARACTERSET' => 'AL32UTF8', 'NLS_CURRENCY' => '$' }; Here are some other strange facts: If I set NLS_LANG to ‘'AMERICAN_AMERICA.UTF8’ the insert executes fine with ‘ñ’ character. If I leave NLS_LANG as ‘'AMERICAN_AMERICA.AL32UTF8' but use ‘Ñ’ the insert will run fine as well.

    Read the article

  • Distributed Cache with Serialized File as DataStore in Oracle Coherence

    - by user226295
    Weird, but I am investigating Oracle Coherence as a substitute for a distributed cache. My primary problem is that we don't have a distributed cache as such in our app right now. That's my major concern, and that's what I want to implement. So, let's say I take another machine and start a new (third) reading process: it will be able to connect to the cache and listen to it, and it will end up holding a full copy of the cache, triplicated (as of now it's duplicated). That's a waste from anyone's standpoint. The size of the cache is 2 GB, and without going distributed it's limiting us. That's what brings me to Coherence. But we also don't have a database as a persistent store; we have archival processes as our persistent store (90 days worth of data). Now multiply that out to somewhere around 2 GB * 90 (that's the bare minimum we want to keep). This is my preliminary/intermediate analysis of Coherence as a solution, and a (supposedly) brilliant thought crossed my mind: why not use this as persistent storage with my distributed cache? Does Oracle Coherence support that? I would get rid of the archiving infrastructure too (I hate daemon archiving processes). For some strange reason, I don't want to go to a database to replace those flat files. What say you - can Coherence be my savior? Any other stable alternatives? (Coherence is imposed on me by the big guys, FYI.)
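    For what it's worth, the usual Coherence hook for plugging your own persistence behind a distributed cache (a database, flat files, an archive service) is a CacheStore wired into the cache configuration. The sketch below is only a rough illustration of the idea: a write-through store that serializes each entry to its own file under a directory. The class name, directory and key-to-filename mapping are made up, and a real implementation would need to handle naming collisions, concurrency and error handling properly.

```java
import com.tangosol.net.cache.CacheStore;
import java.io.*;
import java.util.*;

// Hypothetical file-backed CacheStore: each cache entry is serialized to one file.
public class FileCacheStore implements CacheStore {

    private final File dir;

    public FileCacheStore(String dirName) {
        this.dir = new File(dirName);
        this.dir.mkdirs();
    }

    private File fileFor(Object key) {
        // Naive mapping from key to file name; a real store needs something safer.
        return new File(dir, String.valueOf(key.hashCode()) + ".ser");
    }

    public void store(Object key, Object value) {
        try {
            ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(fileFor(key)));
            try { out.writeObject(value); } finally { out.close(); }
        } catch (IOException e) {
            throw new RuntimeException("store failed for key " + key, e);
        }
    }

    public void storeAll(Map entries) {
        for (Iterator it = entries.entrySet().iterator(); it.hasNext();) {
            Map.Entry e = (Map.Entry) it.next();
            store(e.getKey(), e.getValue());
        }
    }

    public void erase(Object key) {
        fileFor(key).delete();
    }

    public void eraseAll(Collection keys) {
        for (Iterator it = keys.iterator(); it.hasNext();) {
            erase(it.next());
        }
    }

    public Object load(Object key) {
        File f = fileFor(key);
        if (!f.exists()) return null;
        try {
            ObjectInputStream in = new ObjectInputStream(new FileInputStream(f));
            try { return in.readObject(); } finally { in.close(); }
        } catch (Exception e) {
            throw new RuntimeException("load failed for key " + key, e);
        }
    }

    public Map loadAll(Collection keys) {
        Map result = new HashMap();
        for (Iterator it = keys.iterator(); it.hasNext();) {
            Object key = it.next();
            Object value = load(key);
            if (value != null) result.put(key, value);
        }
        return result;
    }
}
```

    A store like this is typically referenced from a cachestore-scheme inside the distributed cache's read-write-backing-map-scheme in the cache configuration file, so the cache calls it transparently on reads and writes.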

    Read the article

  • Batch insert mode with hibernate and oracle: seems to be dropping back to slow mode silently

    - by Chris
    I'm trying to get a batch insert working with Hibernate into Oracle, according to what i've read here: http://docs.jboss.org/hibernate/core/3.3/reference/en/html/batch.html , but with my benchmarking it doesn't seem any faster than before. Can anyone suggest a way to prove whether hibernate is using batch mode or not? I hear that there are numerous reasons why it may silently drop into normal mode (eg associations and generated ids) so is there some way to find out why it has gone non-batch? My hibernate.cfg.xml contains this line which i believe is all i need to enable batch mode: <property name="jdbc.batch_size">50</property> My insert code looks like this: List<LogEntry> entries = ..a list of 100 LogEntry data classes... Session sess = sessionFactory.getCurrentSession(); for(LogEntry e : entries) { sess.save(e); } sess.flush(); sess.clear(); My 'logentry' class has no associations, the only interesting field is the id: @Entity @Table(name="log_entries") public class LogEntry { @Id @GeneratedValue public Long id; ..other fields - strings and ints... However, since it is oracle, i believe the @GeneratedValue will use the sequence generator. And i believe that only the 'identity' generator will stop bulk inserts. So if anyone can explain why it isn't running in batch mode, or how i can find out for sure if it is or isn't in batch mode, or find out why hibernate is silently dropping back to slow mode, i'd be most grateful. Thanks
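    One way to rule out the identifier strategy and make the batching observable is sketched below: name the Oracle sequence explicitly with a pre-allocating generator, set the batch size programmatically, and flush/clear in the same size as the JDBC batch. The sequence name, property values and sizes here are assumptions, not taken from the original post.

```java
import java.util.List;
import javax.persistence.*;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.cfg.AnnotationConfiguration;

@Entity
@Table(name = "log_entries")
class LogEntry {
    // A sequence-backed generator with allocationSize > 1 lets Hibernate hand out
    // ids without a database round trip per row, which keeps the inserts batchable
    // (unlike an identity-style generator, which forces row-by-row inserts).
    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "log_seq")
    @SequenceGenerator(name = "log_seq", sequenceName = "LOG_ENTRIES_SEQ", allocationSize = 50)
    public Long id;
    // ... other fields: strings and ints ...
}

public class BatchInsertExample {

    public static SessionFactory buildFactory() {
        AnnotationConfiguration cfg = new AnnotationConfiguration().configure();
        cfg.addAnnotatedClass(LogEntry.class);
        // Equivalent to <property name="jdbc.batch_size">50</property> in hibernate.cfg.xml.
        cfg.setProperty("hibernate.jdbc.batch_size", "50");
        cfg.setProperty("hibernate.order_inserts", "true");
        return cfg.buildSessionFactory();
    }

    public static void saveAll(SessionFactory factory, List<LogEntry> entries) {
        Session session = factory.openSession();
        session.beginTransaction();
        int i = 0;
        for (LogEntry e : entries) {
            session.save(e);
            if (++i % 50 == 0) {   // flush and clear in the same size as the JDBC batch
                session.flush();
                session.clear();
            }
        }
        session.getTransaction().commit();
        session.close();
    }
}
```

    To see whether batching is really happening, enabling DEBUG logging for Hibernate's JDBC batcher (the org.hibernate.jdbc category in the 3.3 line) should show messages along the lines of "Executing batch size: 50"; if each insert is flushed individually instead, something has silently disabled batching.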

    Read the article

  • Invoking a SOAP ( Web Services ) from ORACLE DB

    - by Mousarules
    I am trying to invoke a SOAP web service from the Oracle database using PL/SQL. From my investigation it appears I have to use the UTL_HTTP package, but I could not get it to work. Where exactly should I place the following SOAP request in my PL/SQL so that it can be invoked - is it possible?

    SOAP 1.1 - the following is a sample SOAP 1.1 request and response. The placeholders shown need to be replaced with actual values.

        POST /gmgwebservice/service.asmx HTTP/1.1
        Host: bulk.umniah.com
        Content-Type: text/xml; charset=utf-8
        Content-Length: length
        SOAPAction: "http://tempuri.org/SendSMS"

        <SendSMS xmlns="http://tempuri.org/">
          <UserName>string</UserName>
          <Password>string</Password>
          <MessageBody>string</MessageBody>
          <Sender>string</Sender>
          <Destination>string</Destination>
        </SendSMS>

        HTTP/1.1 200 OK
        Content-Type: text/xml; charset=utf-8
        Content-Length: length

        <SendSMSResponse xmlns="http://tempuri.org/">
          <SendSMSResult>string</SendSMSResult>
        </SendSMSResponse>

    This web service belongs to a site called Bulk Messaging; the site sends an SMS to a specific mobile number when you fill in some text boxes. I need the same thing to happen from Oracle Forms when a specific action (a job) occurs, but I don't know how to call it from my PL/SQL code. I hope that is clear - is there anything else I should mention?

    Read the article

  • Oracle clients dead wait

    - by Macroideal
    Hi all, I ran into a problem yesterday. Maybe it's because it was April 1st... but it really happened. I have three PCs at a remote site: two clients and one Oracle server. My app runs separately on the two clients, connecting hourly to the Oracle database. The clients worked fine before April 1st, but suddenly my app on the client machines went down. First of all, I did not change any configuration. I use libsqlora8 to connect to the server, and the connection call goes into what looks like a dead loop inside the library. I tried sqlplus, but it also hangs in my shell terminal as if stuck in an infinite loop: no response until I pressed Ctrl+C. So my guess is an "infinite loop" somewhere. By the way, when I connect to the server from my local PC, it works fine, so from this it looks like the problem lies with the client machines. I compared the configuration files on the local machine and the client machines - they are identical. Have you seen a problem like this? I hope it's not due to April 1st.

    Read the article

  • Building vs. Buying a Master Data Management Solution

    - by david.butler(at)oracle.com
    Many organizations prefer to build their own MDM solutions. The argument is that they know their data quality issues and their data better than anyone. Plus a focused solution will cost less in the long run then a vendor supplied general purpose product. This is not unreasonable if you think of MDM as a point solution for a particular data quality problem. But this approach carries significant risk. We now know that organizations achieve significant competitive advantages when they deploy MDM as a strategic enterprise wide solution: with the most common best practice being to deploy a tactical MDM solution and grow it into a full information architecture. A build your own approach most certainly will not scale to a larger architecture unless it is done correctly with the larger solution in mind. It is possible to build a home grown point MDM solution in such a way that it will dovetail into broader MDM architectures. A very good place to start is to use the same basic technologies that Oracle uses to build its own MDM solutions. Start with the Oracle 11g database to create a flexible, extensible and open data model to hold the master data and all needed attributes. The Oracle database is the most flexible, highly available and scalable database system on the market. With its Real Application Clusters (RAC) it can even support the mixed OLTP and BI workloads that represent typical MDM data access profiles. Use Oracle Data Integration (ODI) for batch data movement between applications, MDM data stores, and the BI layer. Use Oracle Golden Gate for more real-time data movement. Use Oracle's SOA Suite for application integration with its: BPEL Process Manager to orchestrate MDM connections to business processes; Identity Management for managing users; WS Manager for managing web services; Business Intelligence Enterprise Edition for analytics; and JDeveloper for creating or extending the MDM management application. Oracle utilizes these technologies to build its MDM Hubs.  Customers who build their own MDM solution using these components will easily migrate to Oracle provided MDM solutions when the home grown solution runs out of gas. But, even with a full stack of open flexible MDM technologies, creating a robust MDM application can be a daunting task. 
For example, a basic MDM solution will need: a set of data access methods that support master data as a service as well as direct real time access as well as batch loads and extracts; a data migration service for initial loads and periodic updates; a metadata management capability for items such as business entity matrixed relationships and hierarchies; a source system management capability to fully cross-reference business objects and to satisfy seemingly conflicting data ownership requirements; a data quality function that can find and eliminate duplicate data while insuring correct data attribute survivorship; a set of data quality functions that can manage structured and unstructured data; a data quality interface to assist with preventing new errors from entering the system even when data entry is outside the MDM application itself; a continuing data cleansing function to keep the data up to date; an internal triggering mechanism to create and deploy change information to all connected systems; a comprehensive role based data security system to control and monitor data access, update rights, and maintain change history; a flexible business rules engine for managing master data processes such as privacy and data movement; a user interface to support casual users and data stewards; a business intelligence structure to support profiling, compliance, and business performance indicators; and an analytical foundation for directly analyzing master data. Oracle's pre-built MDM Hub solutions are full-featured 3-tier Internet applications designed to participate in the full Oracle technology stack or to run independently in other open IT SOA environments. Building MDM solutions from scratch can take years. Oracle's pre-built MDM solutions can bring quality data to the enterprise in a matter of months. But if you must build, at lease build with the world's best technology stack in a way that simplifies the eventual upgrade to Oracle MDM and to the full enterprise wide information architecture that it enables.

    Read the article

  • Oracle SQL Developer v3.2.1 Now Available

    - by thatjeffsmith
    Oracle SQL Developer version 3.2.1 is now available. I recommend that everyone now upgrade to this release. It features more than 200 bug fixes, tweaks, and polish applied to the 3.2 edition. The high profile bug fixes submitted by customers and users on our forums are listed in all their glory for your review. I want to highlight a few of the changes though, as I recognize many of you lack the time and/or patience to ‘read the docs.’ That would include me, which is why I enjoy writing these kinds of blog posts. I’m lazy – just like you! No more artificial line breaks between CREATE OR REPLACE and your PL/SQL In versions 3.2 and older, when you pull up your stored procedural objects in our editor, you would see a line break inserted between the CREATE OR REPLACE and then the body of your code. In version 3.2.1, we have removed the line break. 3.1 3.2.1 Trivia Did You Know? The database doesn’t store the ‘CREATE’ or ‘CREATE OR REPLACE’ bit of your PL/SQL code in the database. If we look at the USER_SOURCE view, we can see that the code begins with the object name. So the CREATE OR REPLACE bit is ‘artificial’ The intent is to give you the code necessary to recreate your object – and have it ‘compile’ into the database. We pretty much HAVE to add the ‘CREATE OR REPLACE.’ From now on it will appear inline with the first line of your code. Exporting Tables & Views When exporting data from your tables or views, previous versions of SQL Developer presented a 3 step wizard. It allows you to choose your columns and apply data filters for what is exported. This was kind of redundant. The grids already allowed you to select your columns and apply filters. Wouldn’t it be more intuitive AND efficient to just make the grids behave in a What You See Is What You Get (WYSIWYG) fashion? In version 3.2.1, that is exactly what will happen. The wizard now only has two steps and the grid will export the data and columns as defined in the visible grid. Let the grid properties define what is actually exported! And here is what is pasted into my worksheet: "BREWERY"|"CITY" "3 Brewers Restaurant Micro-Brewery"|"Toronto" "Amsterdam Brewing Co."|"Toronto" "Ball Brewing Company Ltd."|"Toronto" "Big Ram Brewing Company"|"Toronto" "Black Creek Historic Brewery"|"Toronto" "Black Oak Brewing"|"Toronto" "C'est What?"|"Toronto" "Cool Beer Brewing Company"|"Toronto" "Denison's Brewing"|"Toronto" "Duggan's Brewery"|"Toronto" "Feathers"|"Toronto" "Fermentations! - Danforth"|"Toronto" "Fermentations! - Mount Pleasant"|"Toronto" "Granite Brewery & Restaurant"|"Toronto" "Labatt's Breweries of Canada"|"Toronto" "Mill Street Brew Pub"|"Toronto" "Mill Street Brewery"|"Toronto" "Molson Breweries of Canada"|"Toronto" "Molson Brewery at Air Canada Centre"|"Toronto" "Pioneer Brewery Ltd."|"Toronto" "Post-Production Bistro"|"Toronto" "Rotterdam Brewing"|"Toronto" "Steam Whistle Brewing"|"Toronto" "Strand Brasserie"|"Toronto" "Upper Canada Brewing"|"Toronto" JUST what I wanted And One Last Thing Speaking of export, sometimes I want to send data to Excel. And sometimes I want to send multiple objects to Excel – to a single Excel file that is. In version 3.2.1 you can now do that. Let’s export the bulk of the HR schema to Excel, with each table going to it’s own workbook in the same worksheet. Select many tables, put them in in a single Excel worksheet If you try this in previous versions of SQL Developer it will just write the first table to the Excel file. This is one of the bugs we addressed in v3.2.1. 
Here is what the output Excel file looks like now: Many tables - Many workbooks in an Excel Worksheet I have a sneaky suspicion that this will be a frequently used feature going forward. Excel seems to be the cornerstone of many of our popular features. Imagine that!

    Read the article

  • OS Analytics with Oracle Enterprise Manager (by Eran Steiner)

    - by Zeynep Koch
    Oracle Enterprise Manager Ops Center provides a feature called "OS Analytics". This feature allows you to get a better understanding of how the Operating System is being utilized. You can research the historical usage as well as real time data. This post will show how you can benefit from OS Analytics and how it works behind the scenes. The recording of our call to discuss this blog is available here: https://oracleconferencing.webex.com/oracleconferencing/ldr.php?AT=pb&SP=MC&rID=71517797&rKey=4ec9d4a3508564b3Download the presentation here See also: Blog about Alert Monitoring and Problem Notification Blog about Using Operational Profiles to Install Packages and other content Here is quick summary of what you can do with OS Analytics in Ops Center: View historical charts and real time value of CPU, memory, network and disk utilization Find the top CPU and Memory processes in real time or at a certain historical day Determine proper monitoring thresholds based on historical data Drill down into a process details Where to start To start with OS Analytics, choose the OS asset in the tree and click the Analytics tab. You can see the CPU utilization, Memory utilization and Network utilization, along with the current real time top 5 processes in each category (click the image to see a larger version):  In the above screen, you can click each of the top 5 processes to see a more detailed view of that process. Here is an example of one of the processes: One of the cool things is that you can see the process tree for this process along with some port binding and open file descriptors. Next, click the "Processes" tab to see real time information of all the processes on the machine: An interesting column is the "Target" column. If you configured Ops Center to work with Enterprise Manager Cloud Control, then the two products will talk to each other and Ops Center will display the correlated target from Cloud Control in this table. If you are only using Ops Center - this column will remain empty. The "Threshold" tab is particularly helpful - you can view historical trends of different monitored values and based on the graph - determine what the monitoring values should be: You can ask Ops Center to suggest monitoring levels based on the historical values or you can set your own. The different colors in the graph represent the current set levels: Red for critical, Yellow for warning and Blue for Information, allowing you to quickly see how they're positioned against real data. It's important to note that when looking at longer periods, Ops Center smooths out the data and uses averages. So when looking at values such as CPU Usage, try shorter time frames which are more detailed, such as one hour or one day. Applying new monitoring values When first applying new values to monitored attributes - a popup will come up asking if it's OK to get you out of the current Monitoring Policy. This is OK if you want to either have custom monitoring for a specific machine, or if you want to use this current machine as a "Gold image" and extract a Monitoring Policy from it. You can later apply the new Monitoring Policy to other machines and also set it as a default Monitoring Profile. Once you're done with applying the different monitoring values, you can review and change them in the "Monitoring" tab. You can also click the "Extract a Monitoring Policy" in the actions pane on the right to save all the new values to a new Monitoring Policy, which can then be found under "Plan Management" -> "Monitoring Policies". 
Visiting the past Under the "History" tab you can "go back in time". This is very helpful when you know that a machine was busy a few hours ago (perhaps in the middle of the night?), but you were not around to take a look at it in real time. Here's a view into yesterday's data on one of the machines: You can see an interesting CPU spike happening at around 3:30 am along with some memory use. In the bottom table you can see the top 5 CPU and Memory consumers at the requested time. Very quickly you can see that this spike is related to the Solaris 11 IPS repository synchronization process using the "pkgrecv" command. The "time machine" doesn't stop here - you can also view historical data to determine which of the zones was the busiest at a given time: Under the hood The data collected is stored on each of the agents under /var/opt/sun/xvm/analytics/historical/ An "os.zip" file exists for the main OS. Inside you will find many small text files, named after the Epoch time stamp in which they were taken If you have any zones, there will be a file called "guests.zip" containing the same small files for all the zones, as well as a folder with the name of the zone along with "os.zip" in it If this is the Enterprise Controller or the Proxy Controller, you will have folders called "proxy" and "sat" in which you will find the "os.zip" for that controller The actual script collecting the data can be viewed for debugging purposes as well: On Linux, the location is: /opt/sun/xvmoc/private/os_analytics/collect If you would like to redirect all the standard error into a file for debugging, touch the following file and the output will go into it: # touch /tmp/.collect.stderr   The temporary data is collected under /var/opt/sun/xvm/analytics/.collectdb until it is zipped. If you would like to review the properties for the Analytics, you can view those per each agent in /opt/sun/n1gc/lib/XVM.properties. Find the section "Analytics configurable properties for OS and VSC" to view the Analytics specific values. I hope you find this helpful! Please post questions in the comments below. Eran Steiner

    Read the article

  • Date Tracking in Oracle HRMS

    - by Manoj Madhusoodanan
    Update Date Track Modes

    To maintain employee data effectively, Oracle HCM uses a mechanism called date tracking, whose purpose is to maintain past, present and future data. The update date track modes are:

    CORRECTION : Overwrites the data; no history is maintained.
    UPDATE : Keeps the history; the new change takes effect as of the effective date.
    UPDATE_CHANGE_INSERT : Inserts the record and preserves the future.
    UPDATE_OVERRIDE : Inserts the record and overrides the future.

    Action: Created Employee # 22 on 01-JAN-2012. The record in PER_ALL_PEOPLE_F is shown below.

    Effective Start Date | Effective End Date | Employee Number | Marital Status | Object Version Number
    01-JAN-2012 | 31-DEC-4712 | 24 | | 2

    Action: Updated record in CORRECTION mode

    Effective Start Date | Effective End Date | Employee Number | Marital Status | Object Version Number
    01-JAN-2012 | 31-DEC-4712 | 24 | Single | 3

    Action: Updated record in UPDATE mode effective 01-JUN-2012 and Marital Status = Married

    Effective Start Date | Effective End Date | Employee Number | Marital Status | Object Version Number
    01-JAN-2012 | 31-MAY-2012 | 24 | Single | 4
    01-JUN-2012 | 31-DEC-4712 | 24 | Married | 5

    Action: Updated record in UPDATE mode effective 01-SEP-2012 and Marital Status = Divorced

    Effective Start Date | Effective End Date | Employee Number | Marital Status | Object Version Number
    01-JAN-2012 | 31-MAY-2012 | 24 | Single | 4
    01-JUN-2012 | 31-AUG-2012 | 24 | Married | 6
    01-SEP-2012 | 31-DEC-4712 | 24 | Divorced | 7

    Action: Updated record in UPDATE_CHANGE_INSERT mode effective 01-MAR-2012 and Marital Status = Living Together

    Effective Start Date | Effective End Date | Employee Number | Marital Status | Object Version Number
    01-JAN-2012 | 29-FEB-2012 | 24 | Single | 8
    01-MAR-2012 | 31-MAY-2012 | 24 | Living Together | 9
    01-JUN-2012 | 31-AUG-2012 | 24 | Married | 6
    01-SEP-2012 | 31-DEC-4712 | 24 | Divorced | 7

    Action: Updated record in UPDATE_OVERRIDE mode effective 01-AUG-2012 and Marital Status = Divorced

    Effective Start Date | Effective End Date | Employee Number | Marital Status | Object Version Number
    01-JAN-2012 | 29-FEB-2012 | 24 | Single | 8
    01-MAR-2012 | 31-MAY-2012 | 24 | Living Together | 9
    01-JUN-2012 | 31-JUL-2012 | 24 | Married | 10
    01-AUG-2012 | 31-DEC-4712 | 24 | Divorced | 11

    Delete Date Track Modes

    The delete date track modes are:

    ZAP : Wipes all records.
    DELETE : Deletes the current record.
    FUTURE_CHANGE : Deletes current and future changes.
    DELETE_NEXT_CHANGE : Deletes the next change.

    The Element Entry records are shown below.

    Effective Start Date | Effective End Date | Element Entry Id | Object Version Number
    01-JAN-2012 | 12-OCT-2012 | 129831 | 3
    13-OCT-2012 | 19-OCT-2012 | 129831 | 5
    20-OCT-2012 | 31-DEC-4712 | 129831 | 6

    Action: Delete record in ZAP mode effective 14-JAN-2012

    No rows

    Action: Delete record in DELETE mode effective 14-OCT-2012

    Effective Start Date | Effective End Date | Element Entry Id | Object Version Number
    01-JAN-2012 | 12-OCT-2012 | 129831 | 3
    13-OCT-2012 | 14-OCT-2012 | 129831 | 6

    Action: Delete record in FUTURE_CHANGE mode effective 14-JAN-2012

    Effective Start Date | Effective End Date | Element Entry Id | Object Version Number
    01-JAN-2012 | 31-DEC-4712 | 129831 | 4

    Action: Delete record in NEXT_CHANGE mode effective 14-JAN-2012

    Effective Start Date | Effective End Date | Element Entry Id | Object Version Number
    01-JAN-2012 | 19-OCT-2012 | 129831 | 4
    20-OCT-2012 | 31-DEC-4712 | 129831 | 6

    Read the article

  • ERROR: Not enough space?

    - by dsmoljanovic
    Now this is a very unspecific question. I'm trying to figure out what this message would mean. Here is the story behind it: I'm installing Oracle enterprise manager cloud control (12c r3) on Solaris 10 (5/09). Installer opens up, i enter all needed information and at the last step click Install. It immediately crashes with only "ERROR: Not enough space" written in log and console and nothing else. Now, this could be java error or Solaris error? I'm thinking it's happening either when it starts to copy files or when it tries to launch a process that would do that. What space is it referring to? disk (have ehough), swap (also), memory (yep)... Any ideas are helpful. Edit: i found this exception in the oraInventory logs: oracle.sysman.oii.oiic.OiicInstallAPIException: Not enough space at oracle.sysman.oii.oiic.OiicAPIInstaller.initInstallSession(OiicAPIInstaller.java:2165) at oracle.sysman.oii.oiic.OiicAPIInstaller.initOUIAPISession(OiicAPIInstaller.java:790) at oracle.sysman.install.oneclick.EMGCOUIInstaller.prepareForInstall(EMGCOUIInstaller.java:676) at oracle.sysman.install.oneclick.EMGCSummaryDlgonNext$1.run(EMGCSummaryDlgonNext.java:243) at java.lang.Thread.run(Thread.java:662) at oracle.sysman.install.oneclick.EMGCSummaryDlgonNext.actionsOnClickofNext(EMGCSummaryDlgonNext.java:1067) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at oracle.sysman.install.oneclick.EMGCUtil.performonClickOfNextForClass(EMGCUtil.java:399) at oracle.sysman.install.oneclick.EMGCUtil.performPageLevelValidationsForSilentInstall(EMGCUtil.java:367) at oracle.sysman.install.oneclick.EMGCInstaller.prepareForSilentInstall(EMGCInstaller.java:1459) at oracle.sysman.install.oneclick.EMGCInstaller.main(EMGCInstaller.java:1553) disk status: bash-3.00$ df -h /tmp Filesystem size used avail capacity Mounted on swap 8.1G 2.7G 5.4G 33% /tmp bash-3.00$ df -h /u01 Filesystem size used avail capacity Mounted on / 275G 28G 244G 11% / swap: root@gs12emcc # swap -s total: 18306040k bytes allocated + 3837808k reserved = 22143848k used, 5712664k available

    Read the article

  • Is CRM keeping up with the times?

    - by antonella.buonagurio(at)oracle.com
    Social Customer Relationship Management was born out of the revolution brought about by Web 2.0, an epochal change in the way we communicate that has added incredible richness to the conversations between companies and consumers. Companies now have unprecedented tools for understanding their market; consumers, in turn, have the power to use new channels to express their needs and to communicate and share comments and experiences. But Web 2.0 is not the only factor affecting the strategic CRM choices every company must weigh in order to sustain this new relationship with its consumers. Do you want to find out which forces (or factors) companies need to consider so that their customer relationship management processes keep pace with changed social and economic conditions? To learn more: the white paper produced by Oracle, Paul Gillin and IT Business Edge outlines several of them: 1. The business. How has it changed as a result of the multichannel experience now possible, of customer centricity, and of the social networks that dominate online relationships? 2. The technology. To gain competitive advantage, companies today must equip themselves with the most innovative technologies to add value to their business and to minimize infrastructure costs. Which technologies are they, and what are the actual benefits? ...and more, by reading the white paper "Is your CRM solution keeping up with the times?"

    Read the article

  • New User of UPK?

    - by [email protected]
    The UPK Developer comes with a variety of manuals to help support your organization in the development and deployment of content. The Developer manuals can be found in the \Documentation\Language Code\Reference folder where the Developer has been installed. As of 3.5.x the documentation can also be accessed via the Start menu, Start\Programs\User Productivity Kit\Documentation\Reference. Content Deployment.pdf: This manual provides information on how to deploy content to your audience. Content Development.pdf: This manual provides information on how to create, maintain, and publish content using the Developer. The content of this manual also appears in the Developer help system. Content Player.pdf: This manual provides instructions on how to view content using the Player. The content of this manual also appears in the Player help system. In-Application Support Guide.pdf: This manual provides information on how implement content-sensitive, in-application support for enterprise applications using Player content. Installation & Administration.pdf: This manual provides instructions for installing the Developer in a single-user or multi-user environment as well as information on how to add and manage users and content in a multi-user installation. An Administration help system also appears in the Developer for authors configured as administrators. This manual also provides instructions for installing and configuring Usage Tracking. Upgrade.pdf: This manual provides information on how to upgrade from a previous version to the current version. Usage Tracking Administration & Reporting.pdf: This manual provides instructions on how to manage users and usage tracking reports. - Kathryn Lustenberger, Oracle UPK Outbound Product Management

    Read the article

  • The Importance of Collaboration, Analytics, and Mobile Technologies for Modern HR

    - by HCM-Oracle
    It was 17 years ago, when a McKinsey study uncovered the “war for talent”. Today, it is no point of contention that a strong talent-centric strategy maybe the most important focus for organizations. A talent-centric organization aims at recruiting, retaining and developing the best talent.  The best employees will be able to adapt responsibilities and be able to come up with solutions to solve problems, which are important skills in today’s dynamic work environment, and arguably more important in this recessionary climate.   The notion of hiring and retaining talented employees for organizational sustainability and competitive advantage is not a new concept. But can organizations consider themselves as having a “talent-centric” strategy without up-to-date collaboration tools, HR analytics and mobile technologies in pursuit of attracting, hiring and retaining the best talent? Attend the Upcoming Webcast A webcast on June 19th at 3pm EST will reveal more results of the study. Based on original research done in collaboration between Oracle HCM and HCI, we unveil new findings that explore how critical collaboration, analytic insights and mobile technology are for supporting a talent-centric work environment. You will learn: What are the benefits to being talent-centric? How does collaboration via social networks, analytics with predictive insights and mobile technologies support the talent-centric strategy of an organization? What is the state of play for these technologies? Register Here 

    Read the article

  • How do you integrate sales and production processes in a smart way?

    - by Claudia Caramelli-Oracle
    Technological innovation has transformed the way customers interact with companies. In addition, current market conditions demand care and effectiveness in selling in order to stay fully competitive. To achieve the best sales performance you need to accelerate and automate the exchange of information between the sales and manufacturing departments, minimizing the time spent waiting for technical data and feasibility approvals, and reducing bottlenecks and possible human error through a process of controlling and standardizing the offer. The event sponsors look forward to seeing you on June 11 at the prestigious headquarters of the Unione Industriale di Torino to find out how to: reduce the sales cycle by making the entire sales process more efficient; minimize the impact of sales staff turnover; improve value to promise; achieve better customer loyalty and satisfaction, reducing switching; watch a live, hands-on demonstration by Oracle, the world leader in CPQ (Configure, Price and Quoting) solutions, of a fast, easy-to-use tool that enables smart management of the commercial configuration of a B2B offer, including mobile access and management dashboards; and discover how other companies have successfully adopted these business solutions. Participation in the event is free but capacity is limited, so register right away to secure your place: CLICK HERE to register. If you need more information, write to Silvia Valgoi.

    Read the article

  • EBS Seed Data Comparison Reports Now Available

    - by Steven Chan (Oracle Development)
    Earlier this year we released a reporting tool that reports on the differences in E-Business Suite database objects between one release and another.  That's a very useful reference, but EBS defaults are delivered as seed data within the database objects themselves. What about the differences in this seed data between one release and another? I'm pleased to announce the availability of a new tool that provides comparison reports of E-Business Suite seed data between EBS 11.5.10.2, 12.0.4, 12.0.6, 12.1.1, and 12.1.3.  This new tool complements the information in the data model comparison tool.  You can download the new seed data comparison tool here: EBS ATG Seed Data Comparison Report (Note 1327399.1) The EBS ATG Seed Data Comparison Report provides report on the changes between different EBS releases based upon the seed data changes delivered by the product data loader files (.ldt extension) based on EBS ATG loader control (.lct extension) files.  You can use this new tool to report on the differences in the following types of seed data: Concurrent Program definitions Descriptive Flexfield entity definitions Application Object Library profile option definitions Application Object Library (AOL) key flexfield, function, lookups, value set definitions Application Object Library (AOL) menu and responsibility definitions Application Object Library messages Application Object Library request set definitions Application Object Library printer styles definitions Report Manager / WebADI component and integrator entity definitions Business Intelligence Publisher (BI Publisher) entity definitions BIS Request Set Generator entity definitions ... and more Your feedback is welcomeThis new tool was produced by our hard-working EBS Release Management team, and they're actively seeking your feedback.  Please feel free to share your experiences with it by posting a comment here.  You can also request enhancements to this tool via the distribution list address included in Note 1327399.1.Related Articles Oracle E-Business Suite Release 12.1.3 Now Available New Whitepaper: Upgrading EBS 11i Forms + OA Framework Personalizations to EBS 12 EBS 12.0 Minimum Requirements for Extended Support Finalized Five Key Resources for Upgrading to E-Business Suite Release 12 E-Business Suite Release 12.1.1 Consolidated Upgrade Patch 1 Now Available New Whitepaper: Planning Your E-Business Suite Upgrade from Release 11i to 12.1

    Read the article

  • How to start WebLogic Server using default scripts?

    - by Luz Mestre-Oracle
    There are a few common issues reported when starting weblogic server using scripts. 1. User is not able to access weblogic console. 2. After a few days/hours weblogic server stops abruptly. 3. When user closes putty, they are not able to connect to weblogic server anymore. 4. When user closes windows command prompt, they are not able to connect to weblogic server anymore. 5. Weblogic is started using startManagedWebLogic.cmd/startManagedWebLogic.sh. By default, WebLogic Server does not run in background mode, so after you close the window the process finishes as well. In Linux/Unix based platforms, you need to use: nohup ./startManagedWebLogic.sh <Server> <URL> & In Windows platforms, you need to start Managed Servers using Windows Services: How to Install MS Windows Services For FMW 11g WebLogic Domain Admin and Managed Servers (Doc ID 1060058.1) http://docs.oracle.com/cd/E23943_01/web.1111/e13708/winservice.htm There a few more reasons that could cause similar symptoms, like JVM crash, signals sent by the Operating System, and many other reasons.  But the above steps is the first one to start. Enjoy!

    Read the article

  • Social Media Stations for Partners

    - by Oracle OpenWorld Blog Team
    By Stephanie Spada One of our exciting additions to this year’s Oracle Partner Network Exchange @ OpenWorld are Social Media Stations.  Partners have the opportunity to get customized, face-to-face expert advice on how they can better engage their customers and find new prospects online using social media tools.When: Sunday, September 30Time: 3:00 p.m.–5:00 p.m.Where: Moscone South, Esplanade levelWhen: Monday, October 1Time:  9:30 a.m.–6:00 p.m.Where: Moscone South, OPN Lounge, Exhibitor levelEach customized social media consultation will take only 25 minutes. Here’s how it works:·    Partners check in with a Social Media Rally coordinator who will assess needs and make the right connections for each session·    Partners go to the Photo Station, where a headshot will be taken that can be used on social profiles, Websites or for articles and posts across the Web·    Partners meet with the One-2-One consultants who will walk them through how they’re using social media today and what next steps could beSocial media channels/methods discussed can include Google+, Google Alerts, Google Analytics, Facebook, LinkedIn, Search Engine Optimization, Twitter, and more.  With so many choices, partners can decide how to focus their time.To get the most out of the Social Media Stations, partners should:·    Wear appropriate attire for the headshot photo·    Bring log-in information for social platforms they want to discuss·    Come prepared with questions for the One-2-One consultation so session time can be maximizedFor questions, or to schedule a session ahead of time, partners should send an email to: [email protected].

    Read the article

  • Procurement and E-Business Suite Product Analyzers .. Can you use this tool to resolve your SR?

    - by LindaJ-Oracle
    Procurement and E-Business Suite Product Analyzers (Doc ID 1545562.1). Analyzers are query/read-only tools with easy-to-read HTML output. The tools are delivered by EBS Support via My Oracle Support document IDs for ease of use. The Analyzer scripts are meant to be part of your production maintenance program, run by your sysadmin or by designated end users. The result set is an easy-to-read HTML report that provides recommendations, solutions and early warnings about items that should be reviewed and corrected. Each analyzer can be run on demand, or scheduled for repeatability and emailed to critical reviewers. There are several Analyzers available for E-Business Suite Applications Technology Group, Financials, and Manufacturing, including some of the following topics (review them all at Doc ID 1545562.1): Workflow, Concurrent Processing, Clone Log Parser Utility (Rapid Clone), Invoices, Payments, Accounting, Suppliers and EBTax, Validate Data before Period Close, EBTax Setup, Payables Trial Balance, Internet Expenses, AutoInvoice Post-Process, ASCP Performance, PO Approval, iProcurement Items. For the Procurement-specific Analyzers, access them directly at: R12 IP Item Analyzer Diagnostic Script (Doc ID 1586248.1) and R12: PO Approval Analyzer Diagnostic Script (Doc ID 1525670.1).

    Read the article

  • Wanting a simple overview on how to connect to a SQLite database in Cocoa/Objective-C

    - by Jesse
    Hi, everyone. I've been experimenting with Cocoa and Objective-C programming on the Mac for a few months now, and I am wanting to start developing applications that manage large amounts of data. The trouble is, I'm not really sure where to start with databases. I have a good background in Java programming with SQLite. I've read a bit about CoreData and I haven't been able to find any good resources for just manually connecting to the database. I'm looking for recommendations. Should I try CoreData, and if so, can anyone recommend a good tutorial for someone new to the language? Or, should I try to manually connect and query an SQLite database somehow, and, if so, any tutorials? Any help would be greatly appreciated! Thanks!

    Read the article

  • Creating STA COM compatible ASP.NET Applications

    - by Rick Strahl
    When building ASP.NET applications that interface with old school COM objects like those created with VB6 or Visual FoxPro (MTDLL), it's extremely important that the threads that are serving requests use Single Threaded Apartment Threading. STA is a COM built-in technology that allows essentially single threaded components to operate reliably in a multi-threaded environment. STA's guarantee that COM objects instantiated on a specific thread stay on that specific thread and any access to a COM object from another thread automatically marshals that thread to the STA thread. The end effect is that you can have multiple threads, but a COM object instance lives on a fixed never changing thread. ASP.NET by default uses MTA (multi-threaded apartment) threads which are truly free spinning threads that pay no heed to COM object marshaling. This is vastly more efficient than STA threading which has a bit of overhead in determining whether it's OK to run code on a given thread or whether some sort of thread/COM marshaling needs to occur. MTA COM components can be very efficient, but STA COM components in a multi-threaded environment always tend to have a fair amount of overhead. It's amazing how much COM Interop I still see today so while it seems really old school to be talking about this topic, it's actually quite apropos for me as I have many customers using legacy COM systems that need to interface with other .NET applications. In this post I'm consolidating some of the hacks I've used to integrate with various ASP.NET technologies when using STA COM Components. STA in ASP.NET Support for STA threading in the ASP.NET framework is fairly limited. Specifically only the original ASP.NET WebForms technology supports STA threading directly via its STA Page Handler implementation or what you might know as ASPCOMPAT mode. For WebForms running STA components is as easy as specifying the ASPCOMPAT attribute in the @Page tag:<%@ Page Language="C#" AspCompat="true" %> which runs the page in STA mode. Removing it runs in MTA mode. Simple. Unfortunately all other ASP.NET technologies built on top of the core ASP.NET engine do not support STA natively. So if you want to use STA COM components in MVC or with class ASMX Web Services, there's no automatic way like the ASPCOMPAT keyword available. So what happens when you run an STA COM component in an MTA application? In low volume environments - nothing much will happen. The COM objects will appear to work just fine as there are no simultaneous thread interactions and the COM component will happily run on a single thread or multiple single threads one at a time. So for testing running components in MTA environments may appear to work just fine. However as load increases and threads get re-used by ASP.NET COM objects will end up getting created on multiple different threads. This can result in crashes or hangs, or data corruption in the STA components which store their state in thread local storage on the STA thread. If threads overlap this global store can easily get corrupted which in turn causes problems. STA ensures that any COM object instance loaded always stays on the same thread it was instantiated on. What about COM+? COM+ is supposed to address the problem of STA in MTA applications by providing an abstraction with it's own thread pool manager for COM objects. It steps in to the COM instantiation pipeline and hands out COM instances from its own internally maintained STA Thread pool. 
This guarantees that the COM instantiation threads are STA threads if using STA components. COM+ works, but in my experience the technology is very, very slow for STA components. It adds a ton of overhead and reduces COM performance noticably in load tests in IIS. COM+ can make sense in some situations but for Web apps with STA components it falls short. In addition there's also the need to ensure that COM+ is set up and configured on the target machine and the fact that components have to be registered in COM+. COM+ also keeps components up at all times, so if a component needs to be replaced the COM+ package needs to be unloaded (same is true for IIS hosted components but it's more common to manage that). COM+ is an option for well established components, but native STA support tends to provide better performance and more consistent usability, IMHO. STA for non supporting ASP.NET Technologies As mentioned above only WebForms supports STA natively. However, by utilizing the WebForms ASP.NET Page handler internally it's actually possible to trick various other ASP.NET technologies and let them work with STA components. This is ugly but I've used each of these in various applications and I've had minimal problems making them work with FoxPro STA COM components which is about as dififcult as it gets for COM Interop in .NET. In this post I summarize several STA workarounds that enable you to use STA threading with these ASP.NET Technologies: ASMX Web Services ASP.NET MVC WCF Web Services ASP.NET Web API ASMX Web Services I start with classic ASP.NET ASMX Web Services because it's the easiest mechanism that allows for STA modification. It also clearly demonstrates how the WebForms STA Page Handler is the key technology to enable the various other solutions to create STA components. Essentially the way this works is to override the WebForms Page class and hijack it's init functionality for processing requests. Here's what this looks like for Web Services:namespace FoxProAspNet { public class WebServiceStaHandler : System.Web.UI.Page, IHttpAsyncHandler { protected override void OnInit(EventArgs e) { IHttpHandler handler = new WebServiceHandlerFactory().GetHandler( this.Context, this.Context.Request.HttpMethod, this.Context.Request.FilePath, this.Context.Request.PhysicalPath); handler.ProcessRequest(this.Context); this.Context.ApplicationInstance.CompleteRequest(); } public IAsyncResult BeginProcessRequest( HttpContext context, AsyncCallback cb, object extraData) { return this.AspCompatBeginProcessRequest(context, cb, extraData); } public void EndProcessRequest(IAsyncResult result) { this.AspCompatEndProcessRequest(result); } } public class AspCompatWebServiceStaHandlerWithSessionState : WebServiceStaHandler, IRequiresSessionState { } } This class overrides the ASP.NET WebForms Page class which has a little known AspCompatBeginProcessRequest() and AspCompatEndProcessRequest() method that is responsible for providing the WebForms ASPCOMPAT functionality. These methods handle routing requests to STA threads. Note there are two classes - one that includes session state and one that does not. If you plan on using ASP.NET Session state use the latter class, otherwise stick to the former. This maps to the EnableSessionState page setting in WebForms. This class simply hooks into this functionality by overriding the BeginProcessRequest and EndProcessRequest methods and always forcing it into the AspCompat methods. 
    The way this works is that BeginProcessRequest() fires first to set up the threads and start initializing the handler. As part of that process the OnInit() method fires, which at that point is already running on an STA thread. The code then creates an instance of the actual WebService handler factory and calls its ProcessRequest method, which executes the request and generates the Web Service result. Immediately after ProcessRequest, the request is stopped with Context.ApplicationInstance.CompleteRequest(), which ensures that the rest of the Page handler logic doesn't fire. This means that even though the fairly heavy Page class is overridden here, it doesn't end up executing any of its internal processing, which makes this code fairly efficient. In a nutshell, we're hijacking the Page HttpHandler and forcing it to process the Web Service handler in the context of the AspCompat handler behavior.

    Hooking up the Handler

    Because the above is an HttpHandler implementation, you need to hook up the custom handler and replace the standard ASMX handler. To do this you need to modify the web.config file (here for IIS 7 and IIS Express):

        <configuration>
          <system.webServer>
            <handlers>
              <remove name="WebServiceHandlerFactory-Integrated-4.0" />
              <add name="Asmx STA Web Service Handler"
                   path="*.asmx"
                   verb="*"
                   type="FoxProAspNet.WebServiceStaHandler"
                   precondition="integrated" />
            </handlers>
          </system.webServer>
        </configuration>

    (Note: the name of the WebServiceHandlerFactory-Integrated-4.0 entry might be slightly different depending on your server version. Check the IIS Handler configuration in the IIS Management Console for the exact name, or simply remove the handler from the list there, which will propagate to your web.config.)

    For IIS 5 & 6 (Windows XP/2003) or the Visual Studio Web Server use:

        <configuration>
          <system.web>
            <httpHandlers>
              <remove path="*.asmx" verb="*" />
              <add path="*.asmx" verb="*" type="FoxProAspNet.WebServiceStaHandler" />
            </httpHandlers>
          </system.web>
        </configuration>

    To test, create a new ASMX Web Service with a method like this:

        [WebService(Namespace = "http://foxaspnet.org/")]
        [WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
        public class FoxWebService : System.Web.Services.WebService
        {
            [WebMethod]
            public string HelloWorld()
            {
                return "Hello World. Threading mode is: " +
                       System.Threading.Thread.CurrentThread.GetApartmentState();
            }
        }

    Run this before you put in the web.config configuration changes and you should get:

        Hello World. Threading mode is: MTA

    Then put the handler mapping into web.config and you should see:

        Hello World. Threading mode is: STA

    And you're on your way to using STA COM components. It's a hack, but it works well! I've used this with several high volume Web Service installations with various customers and it's been fast and reliable.
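    With the handler mapping in place the Web Service methods run on STA threads, so they can talk to a legacy COM server directly. Here's a hedged sketch of what such a method might look like if added to the FoxWebService class above - the ProgID MyLegacyLib.InvoiceServer and its GetInvoiceTotal() method are invented for illustration only:

        [WebMethod]
        public decimal GetInvoiceTotal(int invoiceNo)
        {
            // Late-bound COM instantiation on the current (STA) request thread.
            // The ProgID is hypothetical - substitute your own registered component.
            Type comType = Type.GetTypeFromProgID("MyLegacyLib.InvoiceServer");
            dynamic server = Activator.CreateInstance(comType);

            // The call executes on the same STA thread the object was created on
            return (decimal)server.GetInvoiceTotal(invoiceNo);
        }

    Because the COM instance is created and released within a single request, it never crosses apartments even as ASP.NET recycles worker threads between requests.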
    ASP.NET MVC

    ASP.NET MVC has quickly become the most popular ASP.NET technology, replacing WebForms for creating HTML output. MVC is more complex to get started with, but once you understand the basic structure of how requests flow through the MVC pipeline, it's easy to use and amazingly flexible in manipulating HTML requests. In addition, MVC has great support for non-HTML output such as JSON and XML, making it an excellent choice for AJAX requests without any additional tools. Unlike WebForms, ASP.NET MVC doesn't support STA threads natively, so some trickery is needed to make it work with STA threads as well.

    MVC gets its handler implementation through custom route handlers using ASP.NET's built-in routing semantics, so running in an STA handler means wrapping the Page handler inside the Route Handler implementation. As with the Web Service handler, the first step is to create a custom HttpHandler that can instantiate the MVC request pipeline properly:

        public class MvcStaThreadHttpAsyncHandler : Page, IHttpAsyncHandler, IRequiresSessionState
        {
            private RequestContext _requestContext;

            public MvcStaThreadHttpAsyncHandler(RequestContext requestContext)
            {
                if (requestContext == null)
                    throw new ArgumentNullException("requestContext");

                _requestContext = requestContext;
            }

            public IAsyncResult BeginProcessRequest(HttpContext context, AsyncCallback cb, object extraData)
            {
                return this.AspCompatBeginProcessRequest(context, cb, extraData);
            }

            protected override void OnInit(EventArgs e)
            {
                var controllerName = _requestContext.RouteData.GetRequiredString("controller");
                var controllerFactory = ControllerBuilder.Current.GetControllerFactory();
                var controller = controllerFactory.CreateController(_requestContext, controllerName);

                if (controller == null)
                    throw new InvalidOperationException("Could not find controller: " + controllerName);

                try
                {
                    controller.Execute(_requestContext);
                }
                finally
                {
                    controllerFactory.ReleaseController(controller);
                }

                this.Context.ApplicationInstance.CompleteRequest();
            }

            public void EndProcessRequest(IAsyncResult result)
            {
                this.AspCompatEndProcessRequest(result);
            }

            public override void ProcessRequest(HttpContext httpContext)
            {
                throw new NotSupportedException(
                    "STAThreadRouteHandler does not support ProcessRequest called (only BeginProcessRequest)");
            }
        }

    This handler figures out which controller to load and then executes it. MVC internally provides the information needed to route to the appropriate method and pass the right parameters. Like the Web Service handler, the logic occurs in OnInit(), which performs all the processing for the request.

    Next, we need a RouteHandler that can actually pick up this handler. Unlike the Web Service handler, where we simply registered the handler in web.config, MVC requires a RouteHandler: route handlers look at the URL's path and based on that decide which handler to invoke. The route handler is pretty simple - all it does is load our custom handler:

        public class MvcStaThreadRouteHandler : IRouteHandler
        {
            public IHttpHandler GetHttpHandler(RequestContext requestContext)
            {
                if (requestContext == null)
                    throw new ArgumentNullException("requestContext");

                return new MvcStaThreadHttpAsyncHandler(requestContext);
            }
        }

    At this point you can instantiate this route handler and force STA requests to MVC by specifying a route.
    The following sets up the ASP.NET default route:

        Route mvcRoute = new Route("{controller}/{action}/{id}",
            new RouteValueDictionary(
                new { controller = "Home", action = "Index", id = UrlParameter.Optional }),
            new MvcStaThreadRouteHandler());

        RouteTable.Routes.Add(mvcRoute);

    To make this code a little easier to work with and mimic the behavior of the routes.MapRoute() extension method that MVC provides, here is an extension method for MapMvcStaRoute():

        public static class RouteCollectionExtensions
        {
            public static void MapMvcStaRoute(this RouteCollection routeTable,
                                              string name, string url, object defaults = null)
            {
                Route mvcRoute = new Route(url,
                    new RouteValueDictionary(defaults),
                    new MvcStaThreadRouteHandler());

                RouteTable.Routes.Add(mvcRoute);
            }
        }

    With this, the syntax to add a route becomes a little easier and matches the MapRoute() method:

        RouteTable.Routes.MapMvcStaRoute(
            name: "Default",
            url: "{controller}/{action}/{id}",
            defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional }
        );

    The nice thing about this route handler, STA handler and extension method is that it's fully self-contained. You can put all three into a single class file, drop it into your Web app, and simply call MapMvcStaRoute(). It just works. Easy!

    To see whether this works, create an MVC controller like this:

        public class ThreadTestController : Controller
        {
            public string ThreadingMode()
            {
                return Thread.CurrentThread.GetApartmentState().ToString();
            }
        }

    Try this test first with only the MapRoute() hookup in the route configuration, in which case you should get MTA as the value. Then change the MapRoute() call to MapMvcStaRoute(), leaving all the parameters the same, and re-run the request. You should now see STA as the result. You're on your way to using STA COM components reliably in ASP.NET MVC.
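    Actions on an STA route likewise run on STA threads, so a controller can call into a legacy COM component and, say, hand the result back as JSON. Again, this is only an illustrative sketch - the ProgID MyLegacyLib.QuoteServer and its GetQuote() method are placeholders for whatever component you actually use:

        using System;
        using System.Web.Mvc;

        public class QuoteController : Controller
        {
            // Reachable through the STA route, e.g. /Quote/Get?symbol=MSFT
            public ActionResult Get(string symbol)
            {
                // Hypothetical late-bound COM component, created on the STA request thread
                Type comType = Type.GetTypeFromProgID("MyLegacyLib.QuoteServer");
                dynamic server = Activator.CreateInstance(comType);

                decimal price = server.GetQuote(symbol);

                return Json(new { symbol = symbol, price = price },
                            JsonRequestBehavior.AllowGet);
            }
        }

    The COM object is created and used entirely within a single request on the STA thread, so it never crosses apartments.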
    WCF Web Services running through IIS

    WCF Web Services provide a more robust and wider range of services. You can use WCF over HTTP, TCP, and named pipes, and WCF services support WS* secure services. There are many features in WCF that go way beyond what ASMX can do, but it's also a bit more complex than ASMX. As a basic rule, if you need to serve straight SOAP services over HTTP, I'd recommend sticking with the simpler ASMX services, especially if COM is involved. If you need WS* support or want to serve data over non-HTTP protocols, then WCF makes more sense.

    WCF is not my forte, but I found a solution from Scott Seely on his blog that describes the process and seems to work well. I'm copying his code below so this STA information is all in one place, with a quick explanation. Scott's code works by creating a custom OperationBehavior that is applied via an [STAOperationBehavior] attribute on every method. Using his attribute you end up with a class (or interface, if you separate the contract and implementation) that looks like this:

        [ServiceContract]
        public class WcfService
        {
            [OperationContract]
            public string HelloWorldMta()
            {
                return Thread.CurrentThread.GetApartmentState().ToString();
            }

            // Make sure you use this custom STAOperationBehavior
            // attribute to force STA operation of service methods
            [STAOperationBehavior]
            [OperationContract]
            public string HelloWorldSta()
            {
                return Thread.CurrentThread.GetApartmentState().ToString();
            }
        }

    Pretty straightforward. The latter method returns STA while the former returns MTA. To make STA work, every method needs to be marked up. The implementation consists of the attribute and an OperationInvoker implementation. Here are the two classes required to make this work, from Scott's post:

        public class STAOperationBehaviorAttribute : Attribute, IOperationBehavior
        {
            public void AddBindingParameters(OperationDescription operationDescription,
                System.ServiceModel.Channels.BindingParameterCollection bindingParameters)
            {
            }

            public void ApplyClientBehavior(OperationDescription operationDescription,
                System.ServiceModel.Dispatcher.ClientOperation clientOperation)
            {
                // If this is applied on the client, well, it just doesn't make sense.
                // Don't throw in case this attribute was applied on the contract
                // instead of the implementation.
            }

            public void ApplyDispatchBehavior(OperationDescription operationDescription,
                System.ServiceModel.Dispatcher.DispatchOperation dispatchOperation)
            {
                // Change the IOperationInvoker for this operation.
                dispatchOperation.Invoker = new STAOperationInvoker(dispatchOperation.Invoker);
            }

            public void Validate(OperationDescription operationDescription)
            {
                if (operationDescription.SyncMethod == null)
                {
                    throw new InvalidOperationException("The STAOperationBehaviorAttribute " +
                        "only works for synchronous method invocations.");
                }
            }
        }

        public class STAOperationInvoker : IOperationInvoker
        {
            IOperationInvoker _innerInvoker;

            public STAOperationInvoker(IOperationInvoker invoker)
            {
                _innerInvoker = invoker;
            }

            public object[] AllocateInputs()
            {
                return _innerInvoker.AllocateInputs();
            }

            public object Invoke(object instance, object[] inputs, out object[] outputs)
            {
                // Create a new, STA thread
                object[] staOutputs = null;
                object retval = null;
                Thread thread = new Thread(
                    delegate()
                    {
                        retval = _innerInvoker.Invoke(instance, inputs, out staOutputs);
                    });
                thread.SetApartmentState(ApartmentState.STA);
                thread.Start();
                thread.Join();
                outputs = staOutputs;
                return retval;
            }

            public IAsyncResult InvokeBegin(object instance, object[] inputs,
                AsyncCallback callback, object state)
            {
                // We don't handle async...
                throw new NotImplementedException();
            }

            public object InvokeEnd(object instance, out object[] outputs, IAsyncResult result)
            {
                // We don't handle async...
                throw new NotImplementedException();
            }

            public bool IsSynchronous
            {
                get { return true; }
            }
        }

    The key to this setup is the Invoker and its Invoke method, which creates a new thread and then fires the request on that new thread. Because this approach creates a new thread for every request, it's not super efficient: there's a fair amount of overhead involved in creating a thread and throwing it away after each request. It will work for low volume requests, though, and ensures each operation runs in STA mode. If better performance is required, it would be useful to create a custom thread manager that pools a number of STA threads and hands them out as needed rather than creating new threads on every request (a rough sketch of such an approach follows at the end of this section).

    If your Web Service needs are simple and you only need to serve standard SOAP 1.x requests, I would recommend sticking with ASMX services. They're easier to set up and work with, and for STA component use they'll perform significantly better, since ASP.NET manages the STA thread pool for you rather than firing new threads for each request. One nice thing about Scott's code, though, is that it works in any WCF environment, including self hosting. It has no dependency on ASP.NET or WebForms for that matter.
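    To give an idea of what the pooled alternative mentioned above could look like, here's a rough, hypothetical sketch of a single dedicated STA worker thread that executes queued calls. This is my own illustration, not part of Scott's solution; a production version would need several workers, fault handling and shutdown logic, and keep in mind that a single worker serializes all calls through one thread:

        using System;
        using System.Collections.Concurrent;
        using System.Threading;

        // Minimal sketch: one long-lived STA thread that executes queued work items.
        public class StaWorker : IDisposable
        {
            private readonly BlockingCollection<Action> _queue = new BlockingCollection<Action>();
            private readonly Thread _thread;

            public StaWorker()
            {
                _thread = new Thread(() =>
                {
                    // Process work items until CompleteAdding() is called
                    foreach (var action in _queue.GetConsumingEnumerable())
                        action();
                });
                _thread.SetApartmentState(ApartmentState.STA);
                _thread.IsBackground = true;
                _thread.Start();
            }

            // Runs the supplied function on the STA thread and blocks until it completes
            public T Run<T>(Func<T> func)
            {
                using (var done = new ManualResetEventSlim(false))
                {
                    T result = default(T);
                    Exception error = null;

                    _queue.Add(() =>
                    {
                        try { result = func(); }
                        catch (Exception ex) { error = ex; }
                        finally { done.Set(); }
                    });

                    done.Wait();
                    if (error != null)
                        throw error;
                    return result;
                }
            }

            public void Dispose()
            {
                _queue.CompleteAdding();
                _thread.Join();
            }
        }

    An invoker could then route each call through a shared StaWorker instance instead of spinning up and tearing down a new STA thread per request.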
    STA - If you must

    STA components are a pain in the ass, and thankfully there isn't too much stuff out there anymore that requires them. But when you need to access STA functionality from .NET, at least there are a few options available to make it happen. Each of these solutions is a bit hacky, but they work - I've used all of them in production with good results with FoxPro components. I hope compiling all of these in one place makes STA consumption a little bit easier. I feel your pain :-)

    Resources

    Download STA Handler Code Examples
    Scott Seely's original STA WCF OperationBehavior Article

    © Rick Strahl, West Wind Technologies, 2005-2012
    Posted in FoxPro  ASP.NET  .NET  COM

    Read the article

  • Database Trends & Applications column: Database Benchmarking from A to Z

    - by KKline
    Have you heard of the monthly print and web magazine Database Trends & Applications (DBTA)? Did you know I'm the regular columnist covering SQL Server? For the past six months, I've been writing a series of articles about database benchmarking, culminating in the latest article discussing my three favorite database benchmarking tools: the free, open-source HammerDB, the native SQL Server Distributed Replay Utility, and the commercial Benchmark Factory from Dell / Quest Software. Wondering what...(read more)

    Read the article

< Previous Page | 213 214 215 216 217 218 219 220 221 222 223 224  | Next Page >