Search Results

Search found 28693 results on 1148 pages for 'oracle advanced security'.


  • 11.2 Upgrade Companion has been updated!

    - by roy.swonger
    The long-awaited update of the 11.2 Upgrade Companion is now available in My Oracle Support, in the usual location (Note 785351.1). This comprehensive update incorporates lessons learned from adoption of the 11.2.0.2 patchset release. We have also included many more links for customers in a RAC/ASM (Grid Infrastructure) environment, information about the GoldenGate 11g release, and more! As always, the Upgrade Companion is available in PDF and HTML format in addition to the web-viewable java-based document.

    Read the article

  • Fusion Applications Enablement Toolkit: the Partner's single place of information for all OPN Fusion Apps resources

    - by Richard Lefebvre
    Take a look, and then come back regularly, at https://blogs.oracle.com/opnenablement/resource/fusion_applications.html ... a microsite designed to give our EMEA Fusion Partners all the critical Fusion enablement information (key links, events, materials, etc.) they need to achieve specialization. This site will be updated on a regular basis, especially for OPN events and training sessions.

    Read the article

  • 466 ADF sample applications and growing - ADF EMG Kaleidoscope announcement

    - by Chris Muir
    Interested in finding more ADF sample applications? How do 466 applications take your fancy? Today at ODTUG's Kaleidoscope conference in San Antonio, the ADF EMG announced the launch of a new ADF Samples website, an index of 466 ADF applications gathered from expert ADF bloggers, including customers and Oracle staff. For more details on this great ADF community resource, head over to the ADF EMG announcement.

    Read the article

  • How to set up a secure cookie on WebLogic Server

    - by adejuanc
    WebLogic Server allows a user to securely access HTTPS resources in a session that was initiated using HTTP, without loss of session data. To enable this feature, add AuthCookieEnabled="true" to the WebServer element in config.xml:

        <WebServer Name="myserver" AuthCookieEnabled="true"/>

    Setting AuthCookieEnabled to true, which is the default setting, causes the WebLogic Server instance to send a new secure cookie, _WL_AUTHCOOKIE_JSESSIONID, to the browser when authenticating via an HTTPS connection. Once the secure cookie is set, the session is allowed to access other security-constrained HTTPS resources only if the cookie is sent from the browser.

    Thus, WebLogic Server uses two cookies: the JSESSIONID cookie and the _WL_AUTHCOOKIE_JSESSIONID cookie. By default, the JSESSIONID cookie is never secure, but the _WL_AUTHCOOKIE_JSESSIONID cookie is always secure. A secure cookie is only sent when an encrypted communication channel is in use. Assuming a standard HTTPS login (HTTPS is an encrypted HTTP connection), your browser gets both cookies. For subsequent HTTP access, you are considered authenticated if you have a valid JSESSIONID cookie, but for HTTPS access you must have both cookies to be considered authenticated. If you only have the JSESSIONID cookie, you must re-authenticate.

    To configure this in the Admin Console:

    1. Log in to the WebLogic Admin Console.
    2. Under Domain Structure, click on <domain name>.
    3. Select the "Web Applications" tab.
    4. Select "Lock and Edit" in the Change Center.
    5. Click the "Auth Cookie Enabled" checkbox.
    6. Restart to confirm the changes.
    7. Test an application and view the cookie that was stored as "JSESSIONID".

    To configure it in the web application's weblogic-application.xml file:

    1. Run the following to extract weblogic-application.xml from the application archive: $PATH_JDK_HOME\bin\jar -xvf easy-web-examples.ear META-INF/weblogic-application.xml
    2. Add <cookie-secure>true</cookie-secure> between <session-descriptor> and </session-descriptor> in weblogic-application.xml.
    3. Run the following to repackage the file into the application: $PATH_JDK_HOME\bin\jar -uvf easy-web-examples.ear META-INF/weblogic-application.xml
    4. Deploy the application into WebLogic.

    For further information, please read the documentation on "Using Secure Cookies to Prevent Session Stealing": http://download.oracle.com/docs/cd/E12840_01/wls/docs103/security/thin_client.html#wp1053780
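    For reference, here is a minimal sketch of the descriptor fragment the steps above produce (the surrounding content of weblogic-application.xml is abbreviated, and element placement should be verified against the schema for your WebLogic release):

        <weblogic-application>
            <!-- ... other descriptor elements ... -->
            <session-descriptor>
                <!-- send the session cookie only over encrypted (HTTPS) connections -->
                <cookie-secure>true</cookie-secure>
            </session-descriptor>
        </weblogic-application>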

    Read the article

  • network endpoint accessible via hostname only, not address?

    - by Dustin Getz
    Someone told me that this piece of network hardware (a NAS) has a security setting such that it can only be accessed by hostname, not by IP address. I don't understand: I thought DNS resolved the hostname to an address on the connecting client's side, and that the protocol level then always used the raw address. How can this 'security' measure be possible?
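    (One detail worth noting when reasoning about this: DNS resolution does happen on the client, but several protocols then repeat the requested name inside the request itself. HTTP/1.1 is the obvious example - the Host header carries whatever name was typed in, so a device's web interface can serve name-based requests and refuse ones addressed by raw IP. A sketch of the two requests as they would appear on the wire, with a hypothetical hostname and address:

        GET /admin HTTP/1.1
        Host: mynas.local          <- browsed by name; the server can match and accept this

        GET /admin HTTP/1.1
        Host: 192.168.1.50         <- browsed by IP; the server sees a bare address and can reject it

    Whether the NAS in question actually does this depends on its firmware, so treat this as one plausible mechanism rather than a definitive answer.)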

    Read the article

  • Installing Forms and Reports on a development system

    - by Duncan Mills
    By popular demand I've resurrected / updated one of the old blog postings from Jan Carlin's Blog on GroundSide here. A recent (lengthy) post on the Forms forums chronicles the problems some of you have had installing F&R on a development machine. See the link in the headline of this post for the main one. When installing, here are some points to bear in mind:

    - Download and install WebLogic Server first: http://www.oracle.com/technology/software/products/middleware/index.html
    - Find the Forms and Reports (and Disco and Portal) zip files here. Download them to the desktop (or some other temporary directory of your choosing).
    - Unzip both of the two zip files into the same new directory (maybe called 'stage') and check that you have 4 directories in the stage dir when you are finished unzipping: 'Disk1', 'Disk2', 'Disk3' and 'Disk4'. These folders are specified in the zip file structure and must be preserved for the setup executable to work.
    - If you use WinZip and have a right-click menu option that says "Extract to here", use that by right-click-dragging the zip file onto the newly created directory. Don't use the "Extract to folder %HOME%\Desktop\ofm_pfrd...disk_1of2" option. That will get you into the trouble that was reported early in this thread.
    - Free up as much memory as you can. Stop services, background processes, virus scanners, and databases (you don't need a DB to install Forms) and other things lurking about on your machine. You can restart them when the install is done. Around 1.5 GB of free real memory should do it; if it doesn't, free up more if you can. Don't change the swap space unless you know what you are doing - let Windows handle it. A 1 GB machine will likely not be enough; you will likely need at least 2 GB of RAM.
    - Start the install with setup.exe from the 'Disk1' directory.
    - Choose the Install and Configure option unless you have a good reason not to.
    - Choose a unique instance name even if you deinstalled and removed the last install. I suggest using 'asinst_20090722_1' (today's date in ISO format, with an incremented number at the end if you install more than once on a particular day).
    - Unselect Portal and Discoverer and select the Builders you want.
    - Unselect WebCache.
    - Unselect OHS.
    - Unselect the single sign-on option.
    - Check for any failures and choose the retry option if any occur. If that doesn't fix the problem, call Oracle Customer Support.

    Read the article

  • Discover How to Deliver Measurable Business Value from your HCM Strategy

    - by Jay Richey, HCM Product Marketing
    Join our live Webcast on Wednesday, July 13 to learn how to fine-tune your HCM strategy and better utilize your Oracle HCM investment. In this session you'll learn how to access, analyze, and act on information from multiple sources to ensure that all workforce decisions are focused on meeting overall business objectives. Date: Wednesday, July 13, 2011. Time: 10:00 a.m. PT / 1:00 p.m. ET. Register now!

    Read the article

  • How to fill a DataGridView from an Oracle nested table

    - by arkadiusz85
    I want to create my type:

        CREATE TYPE t_read AS OBJECT (
            id_worker NUMBER(20),
            how_much NUMBER(5,2),
            adddate_r DATE,
            date_from DATE,
            date_to DATE
        );

    I create a table type of my object type:

        CREATE TYPE t_tab_read AS TABLE OF t_read;

    The next step is to create a table with my type:

        CREATE TABLE Reading (
            id_watermeter NUMBER(20) constraint Watermeter_fk1 references Watermeters(id_watermeter),
            read t_tab_read
        ) NESTED TABLE read STORE AS store_read;

    Microsoft Visual Studio cannot display this type in a DataGridView. I use OracleCommand (C#):

        using Oracle.DataAccess;
        using Oracle.DataAccess.Client;

        private void button1_Click(object sender, EventArgs e)
        {
            try
            {
                // my working class to connect to database
                ConnectionClass.BeginConnection();
                OracleDataAdapter tmp = new OracleDataAdapter();
                tmp = ConnectionClass.ReadCommand(ReadClass.test());
                DataSet dataset4 = new DataSet();
                tmp.Fill(dataset4, "Read1");
                dataGridView4.DataSource = dataset4.Tables["Read1"];
            }
            catch (Exception o)
            {
                MessageBox.Show(o.Message);
            }
        }

        public class ReadClass
        {
            public static OracleCommand test()
            {
                string sql = "select c.id_watermeter, a.* from reading c, table(c.read) a where id_watermeter=1";
                ConnectionClass.Command1 = new OracleCommand(sql, ConnectionClass.Connection);
                ConnectionClass.Command1.CommandType = CommandType.Text;
                return ConnectionClass.Command1;
            }
        }

    I tried:

        string sql = "select r.id_watermeter, o.id_worker, o.how_much, o.adddate_r, o.date_from, o.date_to from reading r, table (r.read) o where r.id_watermeter=1"
        string sql = "select a.* from reading c , Table (c.read) a where id_watermeter=1"
        string sql = "select a.id_worker, a.how_much, a.adddate_r, a.date_from, a.date_to from reading c , table (c.read) a where id_watermeter=1"
        string sql = "select c.id_watermeter, a.* from reading c , table (c.read) a where id_watermeter=1"

    Error: Unsupported Oracle data type USERDEFINED encountered

    Can somebody help me fill a DataGridView with data from a nested table? I am using Oracle 10g XE.
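    For anyone landing here with the same error: ODP.NET cannot materialize a user-defined nested-table column directly into a DataSet, so the usual workaround is to project only scalar columns from the unnested rows, as some of the attempts above do, and bind the result. Below is a minimal, self-contained sketch of that shape (the connection string is a placeholder, and the table/column names come from the question; if the USERDEFINED error still appears on your provider version, the remaining option is mapping t_read via ODP.NET custom types, i.e. IOracleCustomType):

        using System.Data;
        using Oracle.DataAccess.Client;

        public static class NestedTableLoader
        {
            // Returns the unnested rows of Reading.read for one water meter.
            // Only scalar columns are selected, so the provider never has to
            // materialize the USERDEFINED nested-table type itself.
            public static DataTable LoadReadings(string connectionString, int idWatermeter)
            {
                const string sql =
                    "select r.id_watermeter, o.id_worker, o.how_much, " +
                    "o.adddate_r, o.date_from, o.date_to " +
                    "from reading r, table(r.read) o where r.id_watermeter = :id";

                using (OracleConnection conn = new OracleConnection(connectionString))
                using (OracleCommand cmd = new OracleCommand(sql, conn))
                {
                    cmd.Parameters.Add(new OracleParameter("id", idWatermeter));
                    DataTable table = new DataTable("Read1");
                    new OracleDataAdapter(cmd).Fill(table);  // Fill opens and closes the connection
                    return table;
                }
            }
        }

        // usage from the form:
        // dataGridView4.DataSource = NestedTableLoader.LoadReadings(connStr, 1);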

    Read the article

  • Webcast Replay Available: E-Business Suite Data Protection

    - by BillSawyer
    I am pleased to release the replay and presentation for the latest ATG Live Webcast: E-Business Suite Data Protection (Presentation). Robert Armstrong, Product Strategy Security Architect, and Eric Bing, Senior Director, discussed the best practices and recommendations for securing your E-Business Suite data.

    Finding other recorded ATG webcasts: the catalog of ATG Live Webcast replays, presentations, and all ATG training materials is available in this blog's Webcasts and Training section.

    Read the article

  • OAM11gR2: Enabling SSL in the Data Store

    - by Ekta Malik
    Enabling SSL in the Data Store of OAM 11gR2 comprises the steps below:

    1. Import the certificate(s) required for establishing trust with the store (backend) into the keystore (cacerts) on the machine hosting OAM's WebLogic Admin server.
    2. Restart the WebLogic Admin server.
    3. Specify <Hostname>:<SSL port> in the "Location" field of the Data Store and select the "Enable SSL" checkbox.

    Prerequisites:

    - The certificate(s) to be imported are available for import.
    - The Data Store has already been created using the OAM admin console and the connection to the store is successful on the non-SSL port (though one can always create a Data Store with SSL settings on the first go).

    Steps for importing the certificate(s): one can use the keytool utility that comes bundled with the JDK. The import step is the same for self-signed and third-party certificates (like VeriSign):

        $JAVA_HOME/bin/keytool -import -v -noprompt -trustcacerts -alias <aliasname> -file <path to the certificate file> -keystore $JAVA_HOME/jre/lib/security/cacerts

    Here $JAVA_HOME refers to the path of the JDK install directory. Note: in case multiple certificates are required for establishing the trust, import all of those certificates using the same keytool command mentioned above.

    One can verify the import of the certificate(s) by using the command below:

        $JAVA_HOME/bin/keytool -list -alias <aliasname> -v -keystore $JAVA_HOME/jre/lib/security/cacerts

    Once the trust is established for the SSL communication, specifying the SSL-specific settings in the Data Store (via the OAM admin console) will no longer result in the error seen before the certificates were imported, and the "Test Connection" will be successful.
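    As an optional sanity check before editing the Data Store definition, the SSL endpoint can also be inspected directly from the OAM host. This is not part of the documented procedure - just a common way (assuming the OpenSSL client is installed) to confirm which certificate chain the backend actually presents:

        openssl s_client -connect <Hostname>:<SSL port> -showcerts

    The certificates printed there should correspond to the ones imported into cacerts above.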

    Read the article

  • Developing Schema Compare for Oracle (Part 3): Ghost Objects

    - by Simon Cooper
    In the previous blog post, I covered how we solved the problem of dependencies between objects and between schemas. However, that isn't the end of the issue. The dependencies algorithm I described works when you're querying live databases and you can get dependencies for a particular schema direct from the server, and that's all well and good. To throw a (rather large) spanner in the works, Schema Compare also has the concept of a snapshot, which is a read-only compressed XML representation of a selection of schemas that can be compared in the same way as a live database. This can be useful for keeping historical records or a baseline of a database schema, or comparing a schema on a computer that doesn't have direct access to the database.

    So, how do snapshots interact with dependencies? Inter-database dependencies don't pose an issue, as we store the dependencies in the snapshot. However, comparing a snapshot to a live database with cross-schema dependencies does cause a problem: what if the live database has a dependency on an object that does not exist in the snapshot? Take a basic example schema, where you're only populating SchemaA:

    SOURCE:

        CREATE TABLE SchemaA.Table1 (
            Col1 NUMBER REFERENCES SchemaB.Table1(col1));

        CREATE TABLE SchemaB.Table1 (
            Col1 NUMBER PRIMARY KEY);

    TARGET (using snapshot):

        CREATE TABLE SchemaA.Table1 (
            Col1 VARCHAR2(100));

        CREATE TABLE SchemaB.Table1 (
            Col1 VARCHAR2(100));

    In this case, we want to generate a sync script to synchronize SchemaA.Table1 on the database represented by the snapshot. When taking a snapshot, database dependencies are followed, but because you're not comparing it to anything at the time, the comparison dependencies algorithm described in my last post cannot be used. So, as you only take a snapshot of SchemaA on the target database, SchemaB.Table1 will not be in the snapshot. If this snapshot is then used to compare against the above source schema, SchemaB.Table1 will be included in the source, but the object will not be found in the target snapshot. This is the same problem that was solved with comparison dependencies, but here we cannot use the comparison dependencies algorithm, as the snapshot has no information on SchemaB!

    We've now hit quite a big problem - we're trying to include SchemaB.Table1 in the target, but we simply do not know the status of this object on the database the snapshot was taken from: whether it exists in the database at all, whether it's the same as the target, whether it's different... What can we do about this sorry state of affairs? Well, not a lot, it would seem. We can't query the original database, as it may not be accessible, and we cannot assume any default state, as it could be wrong and break the script (and we currently do not have a roll-back mechanism for failed synchronizations). The only way to fix this properly is for the user to go right back to the start and re-create the snapshot, explicitly including the schemas of these 'ghost' objects. So, the only thing we can do is flag up dependent ghost objects in the UI and ask the user what we should do with them - assume they don't exist, assume they're the same as the target, or specify a definition. Unfortunately, such functionality didn't make the cut for v1 of Schema Compare (as this is very much an edge case for a non-critical piece of functionality), so we simply flag the ghost objects up in the sync wizard as unsyncable, and let the user sort out what's going on and edit the sync script as appropriate.
    There are some things that we do to alleviate somewhat this rather unhappy situation: if a user creates a snapshot from the source or target of a database comparison, we include all the objects registered from the database, not just the ones in the schemas originally selected for comparison. This includes any extra dependent objects registered through the comparison dependencies algorithm. If the user then compares the resulting snapshot against the same database they were comparing against when it was created, the extra dependencies will be included in the snapshot as required and everything will be good.

    Fortunately, this problem will come up quite rarely, and only when the user uses snapshots and tries to sync objects with unknown cross-schema dependencies. However, the solution is not an easy one, and it led to some difficult architecture and design decisions within the product. And all this pain follows from the simple decision to allow schema pre-filtering!

    Next: why adding a column to a table isn't as easy as you would think...

    Read the article

  • BDC Security Issues

    - by geekspt
    We are planning to use BDC to read from, and probably write to, a SQL Server content database on an external server. I have heard that there are many security issues you may run into, or should be aware of, before setting up BDC. Has anyone faced, or does anyone know of, any security issues with BDC? Thanks.

    Read the article

  • Visual Studio 2010, Entity Framework, and Oracle

    - by Tobias Gunn
    While I was working on a Silverlight 4 demo, I found out that Entity Framework is not supported directly through the .NET provider or ODP tools. In order to make them work you need to either write a wrapper of your own (wouldn't chance it) or else use a provider like DataDirect or Quest's upcoming tool. So far, I've been very happy with the DataDirect tool (found here: http://www.datadirect.com/products/net/index.ssp). As I get a little farther along I'll post more on SL4, RIA, and EF.

    Read the article

  • UPK for Testing Webinar Recording Now Available!

    - by Karen Rihs
    For anyone who missed last week's event, a recording of the UPK for Testing webinar is now available. As an implementation and enablement tool, Oracle's User Productivity Kit (UPK) provides value throughout the software lifecycle. Application testing is one area where customers like Northern Illinois University (NIU) are finding huge value in UPK and are using it to validate their systems. Hear Beth Renstrom, UPK Product Manager, and Bettylynne Gregg, NIU ERP Coordinator, discuss how the Test It Mode, Test Scripts, and Test Cases of UPK can be used to facilitate applications testing.

    Read the article

  • Using CTAS & Exchange Partition to Replace IAS for Copying a Partition on Exadata

    - by Bandari Huang
    Usage scenario: copy data and indexes from one partition to another partition in a partitioned table.

    Solution:

    - Create a partition definition.
    - Copy data from one partition to another partition by 'insert as select' (IAS).
    - Create a nonpartitioned table by 'create table as select' (CTAS).
    - Convert the nonpartitioned table into a partition of the partitioned table by exchanging their data segments.
    - Rebuild any unusable indexes.

    Exchange partition conversions:

    - Mutual conversion between a partition (or subpartition) and a nonpartitioned table
    - Mutual conversion between a hash-partitioned table and a partition of a composite *-hash partitioned table
    - Mutual conversion between a [range | list]-partitioned table and a partition of a composite *-[range | list] partitioned table

    Exchange partition usage scenarios:

    - High-speed loading of new, incremental data into an existing partitioned table in a DW environment
    - Exchanging old data partitions out of a partitioned table: the data is purged from the partitioned table without actually being deleted, and can be archived separately

    Exchange partition syntax:

        ALTER TABLE schema.table
            EXCHANGE [PARTITION|SUBPARTITION] [partition|subpartition]
            WITH TABLE schema.table
            [INCLUDING|EXCLUDING] INDEXES
            [WITH|WITHOUT] VALIDATION
            UPDATE [INDEXES|GLOBAL INDEXES]

    INCLUDING | EXCLUDING INDEXES: Specify INCLUDING INDEXES if you want local index partitions or subpartitions to be exchanged with the corresponding table index (for a nonpartitioned table) or local indexes (for a hash-partitioned table). Specify EXCLUDING INDEXES if you want all index partitions or subpartitions corresponding to the partition, and all the regular indexes and index partitions on the exchanged table, to be marked UNUSABLE. If you omit this clause, then the default is EXCLUDING INDEXES.

    WITH | WITHOUT VALIDATION: Specify WITH VALIDATION if you want Oracle Database to return an error if any rows in the exchanged table do not map into the partitions or subpartitions being exchanged. Specify WITHOUT VALIDATION if you do not want Oracle Database to check the proper mapping of rows in the exchanged table. If you omit this clause, then the default is WITH VALIDATION.

    UPDATE INDEXES | GLOBAL INDEXES: Unless you specify UPDATE INDEXES, the database marks UNUSABLE the global indexes or all global index partitions on the table whose partition is being exchanged. Global indexes or global index partitions on the table being exchanged remain invalidated. (You cannot use UPDATE INDEXES for index-organized tables. Use UPDATE GLOBAL INDEXES instead.)

    Notes on exchanging partitions and subpartitions:

    - Both tables involved in the exchange must have the same primary key, and no validated foreign keys can be referencing either of the tables unless the referenced table is empty.
    - When exchanging partitioned index-organized tables:
      - The source and target table or partition must have their primary key set on the same columns, in the same order.
      - If key compression is enabled, then it must be enabled for both the source and the target, and with the same prefix length.
      - Both the source and target must be index organized.
      - Both the source and target must have overflow segments, or neither can have overflow segments. Also, both the source and target must have mapping tables, or neither can have a mapping table.
      - Both the source and target must have identical storage attributes for any LOB columns.
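    To make the workflow concrete, here is a sketch on hypothetical tables (sales_src, sales_tgt, the partition name, and the index name are all invented; both tables are assumed to share the same column structure and partition bounds, per the notes above):

        -- 1. Stage the source partition's rows in a nonpartitioned table (CTAS)
        CREATE TABLE sales_stage AS
            SELECT * FROM sales_src PARTITION (p_2011q1);

        -- (create indexes on sales_stage matching the target's local indexes
        --  here if INCLUDING INDEXES is to be used)

        -- 2. Swap the staged segment in as a partition of the target table
        ALTER TABLE sales_tgt
            EXCHANGE PARTITION p_2011q1
            WITH TABLE sales_stage
            INCLUDING INDEXES
            WITH VALIDATION
            UPDATE GLOBAL INDEXES;

        -- 3. Rebuild anything the exchange left UNUSABLE
        ALTER INDEX sales_tgt_loc_idx REBUILD PARTITION p_2011q1;

    The exchange itself is a data-dictionary operation that swaps segment pointers rather than moving rows, which is why it is so much faster than IAS for large partitions.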

    Read the article

  • Brendan Gregg's "Systems Performance: Enterprise and the Cloud"

    - by user12608550
    Long ago, the prerequisite UNIX performance book was Adrian Cockcroft's 1994 classic, Sun Performance and Tuning: Sparc & Solaris, later updated in 1998 as Java and the Internet. As Solaris evolved to include the invaluable DTrace observability features, new essential performance references have been published, such as Solaris Performance and Tools: DTrace and MDB Techniques for Solaris 10 and OpenSolaris (2006)  by McDougal, Mauro, and Gregg, and DTrace: Dynamic Tracing in Oracle Solaris, Mac OS X and FreeBSD (2011), also by Mauro and Gregg. Much has occurred in Solaris Land since those books appeared, notably Oracle's acquisition of Sun Microsystems in 2010 and the demise of the OpenSolaris community. But operating system technologies have continued to improve markedly in recent years, driven by stunning advances in multicore processor architecture, virtualization, and the massive scalability requirements of cloud computing. A new performance reference was needed, and I eagerly waited for something that thoroughly covered modern, distributed computing performance issues from the ground up. Well, there's a new classic now, authored yet again by Brendan Gregg, former Solaris kernel engineer at Sun and now Lead Performance Engineer at Joyent. Systems Performance: Enterprise and the Cloud is a modern, very comprehensive guide to general system performance principles and practices, as well as a highly detailed reference for specific UNIX and Linux observability tools used to examine and diagnose operating system behaviour.  It provides thorough definitions of terms, explains performance diagnostic Best Practices and "Worst Practices" (called "anti-methods"), and covers key observability tools including DTrace, SystemTap, and all the traditional UNIX utilities like vmstat, ps, iostat, and many others. The book focuses on operating system performance principles and expands on these with respect to Linux (Ubuntu, Fedora, and CentOS are cited), and to Solaris and its derivatives [1]; it is not directed at any one OS so it is extremely useful as a broad performance reference. The author goes beyond the intricacies of performance analysis and shows how to interpret and visualize statistical information gathered from the observability tools.  It's often difficult to extract understanding from voluminous rows of text output, and techniques are provided to assist with summarizing, visualizing, and interpreting the performance data. Gregg includes myriad useful references from the system performance literature, including a "Who's Who" of contributors to this great body of diagnostic tools and methods. This outstanding book should be required reading for UNIX and Linux system administrators as well as anyone charged with diagnosing OS performance issues.  Moreover, the book can easily serve as a textbook for a graduate level course in operating systems [2]. [1] Solaris 11, of course, and Joyent's SmartOS (developed from OpenSolaris) [2] Gregg has taught system performance seminars for many years; I have also taught such courses...this book would be perfect for the OS component of an advanced CS curriculum.

    Read the article

  • FileZilla FTP Server - Security Implications of its Usage on Windows Server 2003

    - by Brian Webster
    I'm running FileZilla Server on my dedicated Windows 2003 server. It uses its own user-access control system, and the FileZilla Server service itself runs under the System user. When I set up users within the FTP Server administrator interface, I do not need to create equivalent Windows users or adjust permissions on folders to allow those users to log in. Example:

    - I set up a TestFTP user with password 'p'.
    - I set the home directory of the TestFTP user to e:/website.
    - I verify that e:/website only has permissions for the System and Admin accounts (right click - Security in Windows Explorer).
    - TestFTP is able to log in to the server just fine.

    I'm OK with this (perhaps due to ignorance?). Is it generally frowned upon to utilize an FTP server such as FileZilla Server that bypasses the built-in user-access control in this way? If I wasn't clear enough, please let me know.

    Read the article

  • WinXP system context menus blank after last security update

    - by Peter Rowell
    Because of a CERT advisory covering several out-of-band security updates released by MS, I did a critical-updates pass on my WinXP Pro SP3 machine. I now have a situation where it seems that all of my WinXP-generated menus come up with all items black. If I wave the cursor over the menu, the items update as they become active (go blue) and then update correctly (to black text on a white background) as they go inactive. Separators (which never get a hover event) stay black. App-level context menus seem to work fine (Firefox, OpenOffice, etc.), with the exception of Windows Explorer and Internet Explorer, which both exhibit this behavior in their context menus and their menu-bar drop-down menus. I'm assuming that's because they all use the same library code. Thoughts? Fixes? Help!

    Read the article

  • vncserver too many security failures

    - by cf16
    I'm trying to connect to my vncserver running on CentOS from my home computer, which sits behind a firewall and has both Windows 7 and Ubuntu installed. I get the error "VNC connection failed: vncserver too many security failures", even when logging in with the right credentials (I reset the password on CentOS). Could it be related to my connecting as root? It's probably also relevant that I have to log in to the remote CentOS box through port 6050 - no other port works for me. Do I have to do something with other ports? I see that vncserver is listening on 5901 (and 5902 if another display is added), and I believe the connection is being established, because from time to time (after a long wait) the password prompt appears. But even when the prompt appeared and I entered the correct password, I got "authentication failure". How can I disable this lockout for testing purposes?
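    For what it's worth, the "too many security failures" message usually comes from the VNC server's blacklist, which kicks in after repeated failed authentication attempts. A minimal sketch of how one might clear or relax it, assuming a TigerVNC-style Xvnc (the Blacklist parameter names below come from that implementation; check Xvnc -help on your build before relying on them):

        # restarting the server clears the accumulated failure blacklist
        vncserver -kill :1
        vncserver :1

        # for testing only: relax the blacklist thresholds (TigerVNC-style options)
        vncserver :1 -BlacklistThreshold 100 -BlacklistTimeout 0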

    Read the article

  • URL query and security

    - by jasmine
    In a URL query with an id I use is_numeric($_GET['id']) for security. But in a query with, for example, a category name, is urlencode() the right way to stay secure? Thanks in advance.
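    A note on the premise, plus a sketch: urlencode() only escapes characters for use inside a URL; it does not make a value safe to embed in SQL. One common pattern is to bind every user-supplied value, numeric or not. The sketch below uses PDO prepared statements; the DSN, credentials, and table/column names are invented for illustration:

        <?php
        // hypothetical connection - adjust the DSN and credentials for your environment
        $pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

        // numeric id: validate, then bind
        $id = $_GET['id'];
        if (!is_numeric($id)) { die('bad id'); }

        // free-text category: don't escape by hand, just bind it
        $stmt = $pdo->prepare('SELECT * FROM products WHERE category = :cat AND id = :id');
        $stmt->execute(array(':cat' => $_GET['category'], ':id' => (int)$id));
        $rows = $stmt->fetchAll();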

    Read the article

  • Wireless not detected on an HP Pavilion g6 - RT5390 PCIe

    - by Earl Larson
    I just bought an HP Pavilion g6 laptop and installed Natty Narwhal alongside Windows 7. In Windows 7 my WiFi works perfectly, but in Ubuntu it absolutely (no matter what I do) won't detect my home WiFi network. Here is the output of "lspci":

        earl@ubuntu:~$ lspci
        00:00.0 Host bridge: Advanced Micro Devices [AMD] RS880 Host Bridge
        00:01.0 PCI bridge: Advanced Micro Devices [AMD] RS780/RS880 PCI to PCI bridge (int gfx)
        00:04.0 PCI bridge: Advanced Micro Devices [AMD] RS780 PCI to PCI bridge (PCIE port 0)
        00:05.0 PCI bridge: Advanced Micro Devices [AMD] RS780 PCI to PCI bridge (PCIE port 1)
        00:06.0 PCI bridge: Advanced Micro Devices [AMD] RS780 PCI to PCI bridge (PCIE port 2)
        00:11.0 SATA controller: ATI Technologies Inc SB7x0/SB8x0/SB9x0 SATA Controller [AHCI mode]
        00:12.0 USB Controller: ATI Technologies Inc SB7x0/SB8x0/SB9x0 USB OHCI0 Controller
        00:12.2 USB Controller: ATI Technologies Inc SB7x0/SB8x0/SB9x0 USB EHCI Controller
        00:13.0 USB Controller: ATI Technologies Inc SB7x0/SB8x0/SB9x0 USB OHCI0 Controller
        00:13.2 USB Controller: ATI Technologies Inc SB7x0/SB8x0/SB9x0 USB EHCI Controller
        00:14.0 SMBus: ATI Technologies Inc SBx00 SMBus Controller (rev 42)
        00:14.1 IDE interface: ATI Technologies Inc SB7x0/SB8x0/SB9x0 IDE Controller (rev 40)
        00:14.2 Audio device: ATI Technologies Inc SBx00 Azalia (Intel HDA) (rev 40)
        00:14.3 ISA bridge: ATI Technologies Inc SB7x0/SB8x0/SB9x0 LPC host controller (rev 40)
        00:14.4 PCI bridge: ATI Technologies Inc SBx00 PCI to PCI Bridge (rev 40)
        00:14.5 USB Controller: ATI Technologies Inc SB7x0/SB8x0/SB9x0 USB OHCI2 Controller
        00:16.0 USB Controller: ATI Technologies Inc SB7x0/SB8x0/SB9x0 USB OHCI0 Controller
        00:16.2 USB Controller: ATI Technologies Inc SB7x0/SB8x0/SB9x0 USB EHCI Controller
        00:18.0 Host bridge: Advanced Micro Devices [AMD] Family 10h Processor HyperTransport Configuration
        00:18.1 Host bridge: Advanced Micro Devices [AMD] Family 10h Processor Address Map
        00:18.2 Host bridge: Advanced Micro Devices [AMD] Family 10h Processor DRAM Controller
        00:18.3 Host bridge: Advanced Micro Devices [AMD] Family 10h Processor Miscellaneous Control
        00:18.4 Host bridge: Advanced Micro Devices [AMD] Family 10h Processor Link Control
        01:05.0 VGA compatible controller: ATI Technologies Inc M880G [Mobility Radeon HD 4200]
        01:05.1 Audio device: ATI Technologies Inc RS880 Audio Device [Radeon HD 4200]
        02:00.0 Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8101E/RTL8102E PCI Express Fast Ethernet controller (rev 05)
        03:00.0 Network controller: Ralink corp. Device 5390
        04:00.0 Unassigned class [ff00]: Realtek Semiconductor Co., Ltd. Device 5209 (rev 01)
        earl@ubuntu:~$

    Read the article

  • WebCenter Customer Spotlight: College of American Pathologists

    - by me
    Author: Peter Reiser - Social Business Evangelist, Oracle WebCenter

    Solution Summary
    College of American Pathologists goes live with Oracle WebCenter - Imaging, AP Invoice Automation, and EBS Managed Attachment with support for Imaging content. The College of American Pathologists (CAP) is the leading organization of board-certified pathologists, serving more than 18,000 physician members; 7,000 laboratories are accredited by the CAP, and approximately 22,000 laboratories are enrolled in the College's proficiency testing programs. The business objective was to content-enable their Oracle E-Business Suite (EBS) enterprise application by combining the best of Imaging and Managed Attachment functionality, providing a unique opportunity for the business to have unprecedented access to both structured and unstructured content from within their enterprise application. The solution improves customer service turnaround time, provides better compliance, and improves maintenance and management of the technology infrastructure.

    Company Overview
    The College of American Pathologists (CAP), celebrating 50 years as the gold standard in laboratory accreditation, is a medical society serving more than 17,000 physician members and the global laboratory community. It is the world's largest association composed exclusively of board-certified pathologists and is the worldwide leader in laboratory quality assurance. The College advocates accountable, high-quality, and cost-effective patient care. The more than 17,000 pathologist members of the College of American Pathologists represent board-certified pathologists and pathologists in training worldwide. More than 7,000 laboratories are accredited by the CAP, and approximately 23,000 laboratories are enrolled in the College's proficiency testing programs.

    Business Challenges
    The CAP business objective was to content-enable their Oracle E-Business Suite (EBS) enterprise application by combining the best of Imaging and Managed Attachment functionality, providing a unique opportunity for the business to have unprecedented access to both structured and unstructured content from within their enterprise application:

    - Bring more flexibility to systems and programs in order to adapt quickly
    - Get a 360-degree view of the customer
    - Reduce the cost of running the business

    Solution Deployed
    With the help of Oracle Consulting, the customer implemented Oracle WebCenter Content as the centralized E-Business Suite document repository. The solution makes it possible to capture, present, and manage all unstructured content (PDFs, word processing documents, scanned images, etc.) related to Oracle E-Business Suite transactions, exposing the related content through the familiar EBS user interface.

    Business Results
    The CAP achieved the following benefits from the implemented solution:

    Managed Attachment solution:
    - Aligns with the strategic Oracle Fusion Middleware platform
    - Integrates with the CAP's existing data capture capabilities
    - A single user interface, provided by the Managed Attachment solution, for all content
    - Better compliance and improved collaboration

    Accounts Payable invoice processing imaging solution:
    - Automated invoice management, eliminating dependency on paper materials and improving compliance, collaboration, and accuracy
    - A single repository to house and secure scanned invoices and all supplemental documents
    - Greater management visibility of the invoice entry process

    Additional Information
    - CAP OpenWorld Presentation
    - Oracle WebCenter Content
    - Oracle WebCenter Capture
    - Oracle WebCenter Imaging
    - Oracle Consulting

    Read the article

  • Save the date! Manageability Partner Community Forum at Oracle Openworld - Oct. 1st

    - by Javier Puerta
    The Exadata & Manageability Partner Communities will be holding a Community Forum in San Francisco during Oracle OpenWorld. The session will take place on Monday, October 1st, from 4:00 - 6:00 pm local time. If you would like to present an experience from a customer project or a sales best practice in the Manageability or Quality & Testing areas, please contact [email protected] with a short description of your proposal.

    Read the article

  • Even More Storage Options in VDI 3.4.1

    - by mprove
    Oracle Virtual Desktop Infrastructure 3.4.1 has been released to complete the storage matrix below.

        Storage Type               | VirtualBox on Solaris | VirtualBox on Enterprise Linux
        ---------------------------+-----------------------+-------------------------------
        Sun ZFS                    | yes                   | yes
        Sun ZFS (pool on Solaris)  | yes                   | yes
        iSCSI                      | -                     | new in VDI 3.4
        Network File System        | new in VDI 3.4.1      | new in VDI 3.4
        Local Storage              | new in VDI 3.4.1      | new in VDI 3.4

    Read the article
