Search Results

Search found 1098 results on 44 pages for 'jan fosgerau'.


  • How to install mod_ssl for Apache

    - by Nick Foote
    OK, so I installed Apache httpd a while ago and have recently come back to it to try to set up SSL and get it serving several different Tomcat servers. At the moment I have two completely separate Tomcat instances serving up two slightly different versions of my web app (one for dev and one for demo, say) to two different ports: mydomain.com:8081 and mydomain.com:8082. I successfully (back in Jan) used mod_jk to get httpd to serve those same Tomcat instances at http://www.mydomain.com:8090/dev and http://www.mydomain.com:8090/demo (8090 cos I've got another app running on 8080 via Jetty at this stage) using the following in httpd.conf:

        LoadModule jk_module modules/mod_jk.so
        JkWorkersFile conf/workers.properties
        JkLogFile logs/mod_jk.log
        JkLogLevel debug
        <VirtualHost *:8090>
            JkMount /devd* tomcatDev
            JkMount /demo* tomcatDemo
        </VirtualHost>

    What I'm now trying to do is enable SSL. I've added the following to httpd.conf:

        Listen 443
        <VirtualHost _default_:443>
            JkMount /dev* tomcatDev
            JkMount /demo* tomcatDemo
            SSLEngine on
            SSLCertificateFile "/opt/httpd/conf/localhost.crt"
            SSLCertificateKeyFile "/opt/httpd/conf/keystore.key"
        </VirtualHost>

    But when I try to restart Apache with "apachectl restart" (yes, after shutting down that other app I mentioned so it doesn't toy with https connections) I continuously get the error:

        Invalid command 'SSLEngine', perhaps misspelled or defined by a module
        not included in the server configuration. httpd not running, trying to start

    I've looked in the httpd/modules dir and indeed there is no mod_ssl, only mod_jk.so and httpd.exp. I've tried using yum to install mod_ssl; it says it's already installed. Indeed I can locate mod_ssl.so in /usr/lib/httpd/modules, but this is NOT the path where I've installed httpd, which is /opt/httpd, and in fact /usr/lib/httpd contains nothing but the modules dir. Can anyone tell me how to install mod_ssl properly for my installed location of httpd so I can get past this error?
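    A possible way out, sketched as an assumption rather than a confirmed fix: the mod_ssl.so that yum installed belongs to the distro's own httpd build under /usr/lib/httpd, and mixing it into a different install is unreliable. If /opt/httpd was built from source, rebuilding the same source tree with SSL enabled drops a matching mod_ssl.so into /opt/httpd/modules:

        # from the httpd source tree matching the version installed in /opt/httpd
        ./configure --prefix=/opt/httpd --enable-so --enable-ssl
        make && make install        # installs modules/mod_ssl.so under /opt/httpd

        # then, in /opt/httpd/conf/httpd.conf, load it before any SSL directives:
        LoadModule ssl_module modules/mod_ssl.so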


  • EPM System Standard Deployment Video Series

    - by Paul Anderson -Oracle
    (in via Jan) A four-part video series on deploying Enterprise Performance Management (EPM) System products has been made available within the Oracle EPMWebcasts YouTube channel. This video series is designed for system administrators. It provides an overview of how to deploy EPM System products using the standard deployment methodology.

    Deploying EPM System Using Standard Deployment video series:
    Part 1 - Overview [3:12] Describes the EPM standard deployment and details each of the videos in the series.
    Part 2 - Preparing for Deployment [3:41] Discusses how to prepare for an EPM standard deployment.
    Part 3 - Installing and Configuring an Initial Instance of EPM System [4:11] Outlines the steps to install and configure an initial instance of EPM System components.
    Part 4 - Scaling Out and Installing EPM System Clients [4:00] Provides an overview of the steps to scale out EPM System components and install EPM System client software.

    More information is available within the PDF document: EPM System Standard Deployment Guide 11.1.2.3. To view and access other Oracle EPM Webcast videos, visit the Oracle EPM Webcasts YouTube Channel. To view and download all of the EPM product documentation, visit the Oracle Technology Network (OTN) EPM Documentation Library. In addition to the Oracle EPM Webcasts YouTube channel, these videos, along with other EPM-related product videos and information, are available in the Oracle Learning Library (OLL) - visit: Oracle Learning Library - EPM Consolidation and Planning Videos.


  • Uganda .NET Usergroup meeting (February 2010)

    - by Malisa L. Ncube
    We had a very interesting .NET meeting in which I gave a short presentation on the new features of .NET 4.0 and Visual Studio 2010. The main presentation came from Jake Markhus @jmarkhus, who talked about NHibernate (http://nhforge.org). The membership of the group is growing each time we meet, and it's very encouraging. Some employers found candidates within our group for hiring, and this has resulted in a mutual benefit between the employers and job-seekers – really cool stuff. Jake above, giving us the soup-to-nuts of NHibernate. We were very excited since a number of companies decided to support us in many ways. Some members won licenses for the Telerik Suite (http://www.telerik.com), while Intellisense LTD (http://www.intellisense.co.ug) agreed to sponsor us on domain registration and hosting of the community site. Jake announced the possibility of having DevExpress (http://www.devexpress.com) sponsor 2 licenses of CodeRush (http://www.devexpress.com/coderush), which would be won in the next meeting raffle. I (@malisancube) would like to thank all the companies and individuals who have decided to sponsor us as a community of developers. We appreciate your support very much. You can download the presentations here: Jan 2010 "VS2010 and NET 4.0" by Malisa Ncube; Feb 2010 "Using NHibernate" by Jake Markhus. Technorati Tags: Uganda .NET Usergroup, .NET 4.0, Community, NHibernate


  • SQL SERVER – Import CSV into Database – Transferring File Content into a Database Table using CSVexpress

    - by pinaldave
    One of the most common data integration tasks I run into is a desire to move data from a file into a database table. Generally the user is familiar with his data, the structure of the file, and the database table, but is unfamiliar with data integration tools and therefore views this task as something that is difficult. What these users really need is a point-and-click approach that minimizes the learning curve for the data integration tool. This is what CSVexpress (www.CSVexpress.com) is all about! It is based on expressor Studio, a data integration tool I've been reviewing over the last several months.

    With CSVexpress, moving data between data sources can be as simple as providing the database connection details, describing the structure of the incoming and outgoing data and then connecting two pre-programmed operators. There's no need to learn the intricacies of the data integration tool or to write code. Let's look at an example.

    Suppose I have a comma separated value data file with data similar to the following, which is a listing of terminated employees that includes their hiring and termination date, department, job description, and final salary.

        EMP_ID,STRT_DATE,END_DATE,JOB_ID,DEPT_ID,SALARY
        102,13-JAN-93,24-JUL-98 17:00,Programmer,60,"$85,000"
        101,21-SEP-89,27-OCT-93 17:00,Account Representative,110,"$65,000"
        103,28-OCT-93,15-MAR-97 17:00,Account Manager,110,"$75,000"
        304,17-FEB-96,19-DEC-99 17:00,Marketing,20,"$45,000"
        333,24-MAR-98,31-DEC-99 17:00,Data Entry Clerk,50,"$35,000"
        100,17-SEP-87,17-JUN-93 17:00,Administrative Assistant,90,"$40,000"
        334,24-MAR-98,31-DEC-98 17:00,Sales Representative,80,"$40,000"
        400,01-JAN-99,31-DEC-99 17:00,Sales Manager,80,"$55,000"

    Notice the concise format used for the date values, the fact that the termination date includes both date and time information, and that the salary is clearly identified as money by the dollar sign and digit grouping. In moving this data to a database table I want to express the dates using a format that includes the century, since it's obvious that this listing could include employees who left the company in both the 20th and 21st centuries, and I want the salary to be stored as a decimal value without the currency symbol and grouping character. Most data integration tools would require coding within a transformation operation to effect these changes, but not expressor Studio. Directives for these modifications are included in the description of the incoming data.

    Besides starting the expressor Studio tool and opening a project, the first step is to create connection artifacts, which describe to expressor where data is stored. For this example, two connection artifacts are required: a file connection, which encapsulates the file system location of my file; and a database connection, which encapsulates the database connection information. With expressor Studio, I use wizards to create these artifacts.

    First I click New Connection > File Connection in the Home tab of expressor Studio's ribbon bar, which starts the File Connection wizard. In the first window, I enter the path to the directory that contains the input file. Note that the file connection artifact only specifies the file system location, not the name of the file. Then I click Next and enter a meaningful name for this connection artifact; clicking Finish closes the wizard and saves the artifact.
    To create the Database Connection artifact, I must know the location, or instance name, of the target database and have the credentials of an account with sufficient privileges to write to the target table. To use expressor Studio's features to the fullest, this account should also have the authority to create a table. I click New Connection > Database Connection in the Home tab of expressor Studio's ribbon bar, which starts the Database Connection wizard. expressor Studio includes high-performance drivers for many relational database management systems, so I can simply make a selection from the "Supplied database drivers" drop down control. If my desired RDBMS isn't listed, I can optionally use an existing ODBC DSN by selecting the "Existing DSN" radio button. In the following window, I enter the connection details. With Microsoft SQL Server, I may choose to use Windows Authentication rather than account credentials. After clicking Next, I enter a meaningful name for this connection artifact, and clicking Finish closes the wizard and saves the artifact. Now I create a schema artifact, which describes the structure of the file data. When expressor reads a file, all data fields are typed as strings. In some use cases this may be exactly what is needed and there is no need to edit the schema artifact. But in this example, editing the schema artifact will be used to specify how the data should be transformed; that is, reformat the dates to include century designations, change the employee and job IDs to integers, and convert the salary to a decimal value. Again a wizard is used to create the schema artifact. I click New Schema > Delimited Schema in the Home tab of expressor Studio's ribbon bar, which starts the Delimited Schema wizard. In the first window, I click Get Data from File, which then displays a listing of the file connections in the project. When I click on the file connection I previously created, a browse window opens to this file system location; I then select the file and click Open, which imports 10 lines from the file into the wizard. I now view the file's content and confirm that the appropriate delimiter characters are selected in the "Field Delimiter" and "Record Delimiter" drop down controls; then I click Next. Since the input file includes a header row, I can easily indicate that fields in the file should be identified through the corresponding header value by clicking "Set All Names from Selected Row." Alternatively, I could enter a different identifier into the Field Details > Name text box. I click Next and enter a meaningful name for this schema artifact; clicking Finish closes the wizard and saves the artifact. Now I open the schema artifact in the schema editor. When I first view the schema's content, I note that the types of all attributes in the Semantic Type (the right-hand panel) are strings and that the attribute names are the same as the field names in the data file. To change an attribute's name and type, I highlight the attribute and click Edit in the Attributes grouping on the Schema > Edit tab of the editor's ribbon bar. This opens the Edit Attribute window; I can change the attribute name and select the desired type from the "Data type" drop down control. In this example, I change the name of each attribute to the name of the corresponding database table column (EmployeeID, StartingDate, TerminationDate, JobDescription, DepartmentID, and FinalSalary).
Then for the EmployeeID and DepartmentID attributes, I select Integer as the data type, for the StartingDate and TerminationDate attributes, I select Datetime as the data type, and for the FinalSalary attribute, I select the Decimal type. But I can do much more in the schema editor.  For the datetime attributes, I can set a constraint that ensures that the data adheres to some predetermined specifications; a starting date must be later than January 1, 1980 (the date on which the company began operations) and a termination date must be earlier than 11:59 PM on December 31, 1999.  I simply select the appropriate constraint and enter the value (1980-01-01 00:00 as the starting date and 1999-12-31 11:59 as the termination date). As a last step in setting up these datetime conversions, I edit the mapping, describing the format of each datetime type in the source file. I highlight the mapping line for the StartingDate attribute and click Edit Mapping in the Mappings grouping on the Schema > Edit tab of the editor’s ribbon bar.  This opens the Edit Mapping window in which I either enter, or select, a format that describes how the datetime values are represented in the file.  Note the use of Y01 as the syntax for the year.  This syntax is the indicator to expressor Studio to derive the century by setting any year later than 01 to the 20th century and any year before 01 to the 21st century.  As each datetime value is read from the file, the year values are transformed into century and year values. For the TerminationDate attribute, my format also indicates that the datetime value includes hours and minutes. And now to the Salary attribute. I open its mapping and in the Edit Mapping window select the Currency tab and the “Use currency” check box.  This indicates that the file data will include the dollar sign (or in Europe the Pound or Euro sign), which should be removed. And on the Grouping tab, I select the “Use grouping” checkbox and enter 3 into the “Group size” text box, a comma into the “Grouping character” text box, and a decimal point into the “Decimal separator” character text box. These entries allow the string to be properly converted into a decimal value. By making these entries into the schema that describes my input file, I’ve specified how I want the data transformed prior to writing to the database table and completely removed the requirement for coding within the data integration application itself. Assembling the data integration application is simple.  Onto the canvas I drag the Read File and Write Table operators, connecting the output of the Read File operator to the input of the Write Table operator. Next, I select the Read File operator and its Properties panel opens on the right-hand side of expressor Studio.  For each property, I can select an appropriate entry from the corresponding drop down control.  Clicking on the button to the right of the “File name” text box opens the file system location specified in the file connection artifact, allowing me to select the appropriate input file.  I indicate also that the first row in the file, the header row, should be skipped, and that any record that fails one of the datetime constraints should be skipped. I then select the Write Table operator and in its Properties panel specify the database connection, normal for the “Mode,” and the “Truncate” and “Create Missing Table” options.  
    If my target table does not yet exist, expressor will create the table using the information encapsulated in the schema artifact assigned to the operator. The last task needed to complete the application is to create the schema artifact used by the Write Table operator. This is extremely easy, as another wizard is capable of using the schema artifact assigned to the Read File operator to create a schema artifact for the Write Table operator. In the Write Table Properties panel, I click the drop down control to the right of the "Schema" property and select "New Table Schema from Upstream Output…" from the drop down menu. The wizard first displays the table description and in its second screen asks me to select the database connection artifact that specifies the RDBMS in which the target table will exist. The wizard then connects to the RDBMS and retrieves a list of database schemas from which I make a selection. The fourth screen gives me the opportunity to fine-tune the table's description. In this example, I set the width of the JobDescription column to a maximum of 40 characters and select money as the type of the FinalSalary column. I also provide the name for the table. This completes development of the application. The entire application was created through the use of wizards, and the required data transformations were specified through simple constraints and specifications rather than through coding. To develop this application, I only needed a basic understanding of expressor Studio, a level of expertise that can be gained by working through a few introductory tutorials. expressor Studio is as close to a point-and-click data integration tool as one could want, and I urge you to try this product if you have a need to move data between files or from files to database tables. Check out CSVexpress in more detail. It offers a few basic video tutorials and a preview of expressor Studio 3.5, which will support the reading and writing of data into Salesforce.com. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology
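    For readers who prefer to pre-create the target table instead of letting expressor create it, here is a minimal T-SQL sketch matching the column names and types described above; the table name and the NULL/NOT NULL choices are assumptions, not taken from the article:

        CREATE TABLE TerminatedEmployees (        -- table name hypothetical
            EmployeeID      int          NOT NULL,
            StartingDate    datetime     NOT NULL,
            TerminationDate datetime     NOT NULL,
            JobDescription  varchar(40)  NULL,    -- 40-character width from the article
            DepartmentID    int          NOT NULL,
            FinalSalary     money        NULL     -- money type chosen in the wizard
        );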


  • Converting a JD Edwards Date to a System.DateTime

    - by Christopher House
    I'm working on moving some data from JD Edwards to a SQL Server database using SSIS and needed to deal with the way in which JDE stores dates. The format is CYYDDD, where:

        C   = century, 1 for >= 2000 and 0 for < 2000
        YY  = the last two digits of the year
        DDD = the number of the day. Jan 1 = 1, Dec. 31 = 365 (or 366 in a leap year)

    The .NET base class library has lots of good support for handling dates, but nothing as specific as the JD Edwards format, so I needed to write a bit of code to translate the JDE format to System.DateTime. The function is below:

        public static DateTime FromJdeDate(double jdeDate)
        {
          DateTime convertedDate = DateTime.MinValue;
          if (jdeDate >= 30001 && jdeDate <= 200000)
          {
            short yearValue = (short)(jdeDate / 1000d + 1900d);
            short dayValue = (short)((jdeDate % 1000) - 1);
            convertedDate = DateTime.Parse("01/01/" + yearValue.ToString()).AddDays(dayValue);
          }
          else
          {
            throw new ArgumentException("The value provided does not represent a valid JDE date", "jdeDate");
          }
          return convertedDate;
        }

    I'd love to take credit for this myself, but this is an adaptation of a T-SQL UDF that I got from another consultant at the client site.
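    A quick hedged spot-check of the conversion (this snippet is illustrative, not from the original post, and assumes the function is in scope):

        // 113003 in CYYDDD: C=1 → 2000s, YY=13 → 2013, DDD=003 → 3rd day of the year
        DateTime d = FromJdeDate(113003);
        Console.WriteLine(d);   // prints Jan 3, 2013 (1/3/2013 12:00:00 AM)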


  • ArchBeat Link-o-Rama for 11/22/2011

    - by Bob Rhubart
    A Brief Introduction on Migrating to an Oracle-based Cloud Environment | Tom Laszewski
    "Before you can start migrating to the cloud, you must define what the cloud means to you," says Tom Laszewski. "The cloud is not a specific software or hardware product, contrary to what many technology vendors would have you believe."

    Custom Exception Registration for ADF BC EO Attribute | Andrejus Baranovskis
    "Sometimes customers prefer to implement business logic validation completely in Java, without using ADF BC declarative/Groovy validation rules," says Oracle ACE Director Andrejus Baranovskis. "That's fine; we can code business logic validation in ADF and implement different custom validation methods on the VO/EO level."

    Oracle Exadata Virtual Conference - Jan 20 2012
    The Exadata SIG, along with IOUG, is organizing the first Exadata Virtual Conference, to be held on January 20, 2012. Proposals for presentations are now being accepted.

    Smooth Sailing or Rough Waters: Navigating Policy Administration Modernization | Helen Pitts
    "It's no surprise that fueling growth, both now and in the future, continues to be a key driver for modernization," says Helen Pitts. "Why? Inflexible, hard-coded legacy systems require customization by IT every time a change is required."

    Architects putting on the Ritz; Info integration book learning; Platform for SAS Grid Computing
    This week on the Architect Home Page on OTN.

    Webcast: Introducing Oracle WebLogic Server 12c: Developer Deep Dive - Dec 1 - 11am PT / 2pm ET
    Learn how Oracle WebLogic Server 12c enables rapid development of modern, lightweight Java EE 6 applications. Discover how you can leverage the latest development technologies, tools and standards when deploying to Oracle WebLogic Server across both conventional and cloud environments.

    Architecture all day. Oracle Technology Network Architect Day - Phoenix, AZ - Dec 14. Free registration.
    When: December 14, 2011. Where: The Ritz-Carlton, Phoenix, 2401 East Camelback Road, Phoenix, AZ 85016. Registration is free, but seating is limited.


  • Happy Chinese New Year!

    - by Shaun
    Today is the 29th day of the twelfth month in the Chinese traditional calendar, which means that on Thursday (3rd of Feb) we will have the Chinese New Year! For those who don't know about the Chinese New Year, please visit the Wikipedia site. This is the most important holiday not only for the Chinese in China, but for the Chinese all around the world. Here I would like to say: 春节快乐 (Chun Jie Kuai Le, Happy Chinese New Year). OK, I have 3 pieces of news to go with my celebration:

    1. The new Windows Azure developer portal has been published for a while, and the Windows Azure team wants to know what we think about it. There is a survey available where you can send your feedback. PS: please refer to my previous blog for the features of this new site.

    2. The latest Windows Azure Platform Training Kit Jan Update has been released, and you can download it here. There is a demo and a hands-on lab about Windows Phone 7 applications with Windows Azure available, which should be interesting.

    3. If you have heard about the new SQL Azure feature named SQL Azure Federation, you might know that it's a cool feature and solution for database sharding. But for now there seems to be no similar solution for normal SQL Server and local databases. I have created a library named PODA, which stands for Partition Oriented Data Access, and which partially implements the features of SQL Azure Federation. I'm going to explain more about this project after the Chinese New Year, but you can download the source code here.

    Hope this helps, Shaun. All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.


  • Duke's Choice Award Goes Regional

    - by Tori Wieldt
    We are pleased to announce the expansion of the Duke's Choice Award program to include regional awards in conjunction with each international JavaOne conference. The expanded Duke's Choice Award program celebrates Java innovation happening within specific regions and provides an opportunity to recognize winners locally. Regions include Latin America (LAD), Europe/Africa/Middle East (EMEA), and Asia. The global program will continue in association with the flagship JavaOne conference. First up: Duke's Choice Awards LAD. Three winners will be announced on stage during JavaOne Latin America, December 4th to 6th, and in the Jan/Feb issue of Java Magazine. Submit your nominations now through October 30th! Nominations are accepted from anyone, including Oracle employees, for compelling uses of Java technology or community involvement. Duke's Choice Awards LAD judges include community members Yara Senger (Brazil) and Alexis Lopez (Colombia). In keeping with the 10-year tradition of the Duke's Choice Award program, the most important ingredient is innovation. Let's recognize and celebrate the innovation that Java delivers within Latin America! www.java.net/dukeschoiceLAD To see the 2012 global Duke's Choice Awards winners now, subscribe to Java Magazine.


  • WinUSB formatting and installation error, lost data storage on 32 GB flash drive

    - by Cary Felton
    I recently purchased a PNY brand 32 GB USB drive to use for configuring a dual boot on my computer and for HDD file backups. I also recently installed Zorin OS 6.1 as my main OS, then downloaded WinUSB for Linux and set it to install the ISO for the Windows 8 Release Preview, as the activation code is still good until Jan 2013, and there was an error during the installation. I rebooted the computer, plugged in the USB drive, and went from 32 GB of usable space to somehow having a partitioned drive (mounted as two separate drives), one being 3.7 GB and the other being, I believe, 25 GB or so. I then formatted the drive with GParted back to FAT32, and it still read as a Windows USB with the installation files still on it? So then I took my USB drive to my library, which runs Windows 8, installed BootIce, and completely reformatted my USB drive, and somehow it is usable, but still not perfect. It reads that I only have 29.9 GB of usable space. I have already thought of the non-usable area, and that doesn't account for this error, as when I first bought the drive and plugged it in on Linux, it read the total drive space as 32.2 and exactly 32 was usable. Somehow I am short by 2 GB of space, which is very critical for what I am doing. I love Linux; I only needed Windows for Blu-ray playback with daplayer, as it won't work well in Wine, but if I keep losing space on my USB drives I'm afraid I'll have to switch back. Any help would be appreciated, as I already visited PNY's site and they have no support for this issue, and I'm not that fond of partitions, file systems, or formatting.
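    One hedged observation (an assumption about the tools involved, not a confirmed diagnosis): part of a gap like this is often just decimal-versus-binary units, since drive makers count 10^9-byte gigabytes while many OS tools report 2^30-byte gibibytes:

        # 32 GB as marketed, expressed in GiB
        echo "scale=1; 32 * 10^9 / 2^30" | bc    # → 29.8, close to the 29.9 reported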


  • Speaker at the German Visual FoxPro Developer Conference 2005

    The following is an excerpt from the UniversalThread conference coverage of the German Visual FoxPro Developer Conference 2005, written by Armin Neudert and Jan Vit. Unfortunately, my sessions were not covered at all, but I was there as a speaker after all: [...] We are happy to welcome back several speakers that have already given sessions at previous DevCons but hadn't been here for one or more years. In detail: Steven Black is back after several years. Marcia Akins and her husband Andy Kramek couldn't come in 2004 and are back again now. Regarding German speakers, Andreas Flohr and Torsten Weggen are also here again, after not doing sessions for two, respectively four, years at this conference. At this point we would like to send some regards to the speakers that couldn't come to Frankfurt this year, since they are very busy at the moment or are doing sessions elsewhere in the world right now. We are also proud to announce several speakers that are here for the very first time. Welcome to Doug Hennig, Rick Schumer, Craig Berntson, Marcus Luz and Benjamin Anders. And of course, there are all the well-known speakers who did great sessions over the last years: Sebastian Flucke, Uwe Habermann, Peter Herzog, Venelina Jordanova, Dan Jurden, Jochen Kirstätter, Nathalie Mengel, Lisa Slater Nichols, Michael Niethammer, Rick Strahl, Markus Winhard, Eugen Wirsing, Christof Wollenhaupt and myself - Armin Neudert :-) [...]


  • Java Spotlight Episode 112: Joonas Lehtinen on @Vaadin

    - by Roger Brinkley
    Interview with Joonas Lehtinen on Vaadin. Right-click or Control-click to download this MP3 file. You can also subscribe to the Java Spotlight Podcast Feed to get the latest podcast automatically. If you use iTunes you can open iTunes and subscribe with this link: Java Spotlight Podcast in iTunes.

    Show Notes
    News: Java Smart Metering video; JavaFX for Tablets and Mobile survey on FXExperience; multiple JSRs migrating to the latest JCP version; a number of JEPs added to JDK 8 features and JDK 8 Milestones; Adopt-a-JSR for Java EE 7.
    Events: Dec 14-15, IndicThreads, Pune, India; Dec 20, 9:30am, JCP Spec Lead Call December on Developing a TCK; Jan 15-16, JCP EC Face to Face Meeting, West Coast USA.
    Feature Interview: Joonas Lehtinen started the development of Vaadin, a Java-based open source framework for building business-oriented Rich Internet Applications. He has been developing applications for the web since 1995 with a strong focus on Ajax and Java. He is also the founder and CEO of the company behind the Vaadin framework.
    What's Cool: Hinkmond Wong's work with Raspberry Pi and Java Embedded GPIO; Collaborative Whiteboard using WebSocket.


  • Certifications in the new Certify

    - by richard.miller
    The most up-to-date certifications are now available in Certify - New Additions Feb 2011! What's not yet available can still be found in Classic Certify. We think that the new search will save you a ton of time and energy, so try it out and let us know. NOTE: Not all cert information is in the new system. If you type in a product name and do not find it, send us feedback so we can find the team to add it! Also, we have been listening to every feedback message coming in. We have plans to make some improvements based on your feedback AND add the missing data. Thanks for your help! Japanese: 日本語. Note: Oracle Fusion Middleware certifications are available via oracle.com: Fusion Middleware Certifications.

    Certifications viewable in the new Certify Search:
    Oracle Database
    Oracle Database Options
    Oracle Database Clients (they apply to both 32-bit and 64-bit)
    Oracle Enterprise Manager
    Oracle Beehive
    Oracle Collaboration Suite
    Oracle E-Business Suite, now with Release 11i & 12!
    Oracle Siebel Customer Relationship Management (CRM)
    Oracle Governance, Risk, and Compliance Management
    Oracle Financial Services
    Oracle Healthcare
    Oracle Life Sciences
    Oracle Enterprise Taxation Management
    Oracle Retail
    Oracle Utilities
    Oracle Cross Applications
    Oracle Primavera
    Oracle Agile
    Oracle Transportation Management (G-L)
    Oracle Value Chain Planning
    Oracle JD Edwards EnterpriseOne (NEW! Jan 2011) 8.9+ and SP23+
    Oracle JD Edwards World (A7.3, A8.1, A9.1, and A9.2)

    Certifications viewable in Classic Certify:
    Classic Certify is the "old" user interface. Clicking the "Classic Certify" link from Certifications QuickLinks will take you there.
    Enterprise PeopleTools Release 8.49, Release 8.50, and Release 8.51

    Other Resources: See the Tips and Tricks for the new Certify. Watch the 4-minute introduction to the new Certify, or learn how to get the most out of Certify with an advanced searching and features demo.


  • Submit Nominations for Duke's Choice Awards Latin America

    - by Tori Wieldt
    The Duke's Choice Awards are nominated by members of the Java community and recognize compelling uses of Java technology or community involvement.  The first of the regional Duke's Choice Awards will be in December in Latin America. Three winners will be announced on stage during JavaOne Latin America December 4th to 6th and in the Jan/Feb issue of Java Magazine.   Nominations are accepted from anyone in the Java community for compelling uses of Java technology or community involvement.   Duke's Choice Awards LAD judges include community members Yara Senger (Brazil) and Alexis Lopez (Colombia). In keeping with the 10 year tradition of the Duke's Choice Award program, the most important ingredient is innovation. Let's recognize and celebrate the innovation that Java delivers within Latin America! Submit your nominations now!  Nominations close 7 November. www.java.net/dukeschoiceLAD As announced at JavaOne San Francisco, the Duke's Choice Award program has been expanded to include regional awards in conjunction with each international JavaOne conference.  The expanded Duke's Choice Award program celebrates Java innovation happening within specific regions and provides an opportunity to recognize winners locally. Regions include Latin America (LAD), Europe Africa Middle East (EMEA), and Asia.  The global program will continue in association with the flagship JavaOne conference.  


  • Asset and Work Management in Utilities: An Integrated Enterprise Success Story

    - by stephen.slade(at)oracle.com
    Jan 11 '11 Webcast: Utilities are turning to Oracle to deliver an integrated EAM platform that manages all of their assets, from fleet to facilities and distribution to generation. Hear from solutions experts and from Sunflower Electric Power Corporation about how an integrated enterprise asset and work management system helped them deliver bottom-line results. Do you have different work management systems for generation, distribution, and transmission? Fleet maintenance? Facilities? Are you on the latest release of these products? Have you considered your options when the product is no longer supported? Do you struggle with integration and keeping the various systems "in balance"? Do you have trouble retrieving data from these disparate systems and getting an enterprise view of asset and work management operations? Utilities are challenged to better manage information on generation, transmission and distribution assets. Point solutions for Enterprise Asset Management (EAM) and Computerized Maintenance Management Systems (CMMS) are often effective as departmental solutions but have limited ability to deliver an enterprise solution with accessible business intelligence.
    Date: January 11, 2011 @ 10am PT/1pm ET
    EVITE: http://www.oracle.com/us/dm/h2fy11/63025-wwmk10040611mpp054c003-se-197386.html
    Register: HERE


  • How to install Oracle Database 11g Express Edition on Ubuntu 12.10?

    - by Praneeth Pj
    I installed the Oracle database following the steps mentioned in this blog:

    1. Downloaded 11g Express Edition.
    2. Created a new user oracle under the group dba. The following steps were executed as this user.
    3. Unzipped oracle-xe-11.2.0-1.0.x86_64.rpm.zip and then converted the rpm to an Ubuntu package by running:
           sudo alien --scripts -d oracle-xe-11.2.0-1.0.x86_64.rpm
    4. Created the /sbin/chkconfig file and added the entries as specified there.
    5. Created /etc/sysctl.d/60-oracle.conf and added the entries as specified in the same link as above.
    6. Ran the commands:
           ln -s /usr/bin/awk /bin/awk
           mkdir /var/lock/subsys
           touch /var/lock/subsys/listener
    7. Installed the .deb generated in step 3:
           sudo dpkg --install oracle-xe_11.2.0-2_amd64.deb
    8. Left the default values as they are: sudo /etc/init.d/oracle-xe configure
    9. Set the following env variables in the ~/.bashrc file:
           export ORACLE_HOME=/u01/app/oracle/product/11.2.0/xe
           export ORACLE_SID=XE
           export NLS_LANG=`$ORACLE_HOME/bin/nls_lang.sh`
           export ORACLE_BASE=/u01/app/oracle
           export LD_LIBRARY_PATH=$ORACLE_HOME/lib:$LD_LIBRARY_PATH
           export PATH=$ORACLE_HOME/bin:$PATH
    10. Ran the commands:
           chown -R oracle:dba /var/tmp/.oracle
           chmod -R 755 /var/tmp/.oracle
           chown -R oracle:dba /tmp/.oracle
           chmod -R 755 /tmp/.oracle
    11. Started the Oracle Database 11g Express Edition instance: sudo service oracle-xe start
    12. Ran sqlplus / as sysdba and got the following:

           SQL*Plus: Release 11.2.0.2.0 Production on Thu Jan 3 09:41:58 2013
           Copyright (c) 1982, 2011, Oracle. All rights reserved.
           Connected to an idle instance.

    Now when executing any SQL statement in SQL*Plus, I end up with the following error:

        SQL> select * from dual;
        select * from dual
        *
        ERROR at line 1:
        ORA-01034: ORACLE not available
        Process ID: 0
        Session ID: 0 Serial number: 0

    I have increased the swap memory as specified here:

        $ free -m
                     total       used       free     shared    buffers     cached
        Mem:          3901       3428        473          0        182       1988
        -/+ buffers/cache:       1258       2643
        Swap:         5066          0       5066
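    The "Connected to an idle instance" banner means SQL*Plus attached without a started database, which is exactly what ORA-01034 reports. A hedged next step (not from the original question) is to start the instance manually and read whatever error comes back:

        sqlplus / as sysdba
        SQL> startup
        -- a failed STARTUP prints the underlying ORA- error (memory target,
        -- missing parameter file, etc.), which usually points at the real cause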


  • Oracle Magazine: Getting started with SQL Analytics

    - by KLaker
    I am currently working on a series of podcasts covering the broad categories of our SQL analytical functions and features, and while I was doing some research I came across a series of four articles in Oracle Magazine. This series of articles is written by Melanie Caffrey, who is a senior development manager at Oracle. She is a coauthor of Expert PL/SQL Practices for Oracle Developers and DBAs (Apress, 2011) and Expert Oracle Practices: Oracle Database Administration from the Oak Table (Apress, 2010). The four articles run under the banner "Technology: SQL 101", and parts 9, 10, 11 and 12 cover SQL analytics. Here are the links to the four articles: Jan 2013 Having Sums, Averages, and Other Grouped Data; March 2013 A Window into the World of Analytic Functions; May 2013 Leading Ranks and Lagging Percentages: Analytic Functions, Continued; July 2013 Pivotal Access to Your Data: Analytic Functions, Concluded. The articles cover topics such as GROUP BY, SUM, AVG, HAVING, window functions, RANK, FIRST, LAST, LAG, LEAD, etc. The great news is that you can try out the examples in this series. All you need is access to an Oracle Database instance. All the schemas, data sets and SQL statements that you will need can be downloaded from a link included in the January article. I hope you find this series of articles useful.
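    As a flavor of what the series covers, here is a small illustrative query; the table and column names are hypothetical, not taken from the articles:

        SELECT department_id,
               last_name,
               salary,
               RANK() OVER (PARTITION BY department_id
                            ORDER BY salary DESC) AS dept_salary_rank,
               LAG(salary)  OVER (PARTITION BY department_id
                                  ORDER BY salary DESC) AS next_higher_salary
        FROM   employees;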


  • Update: GTAS and EBS

    - by jeffrey.waterman
    Provided below are updated target date timeframes for patches supporting upcoming legislative enhancements. Dates have been pushed out from those previously provided due to changes in Treasury mandatory dates; the mandatory dates for GTAS and IPAC have changed since the previous target dates for patches were given. These are target dates, not commitments to deliver functionality.

    R12
    Deliverable | Target Timeframe for Customer Patches | Comments
    GTAS Configuration | Apr 2012 | Patch is available
    GTAS Key Processes | Oct/Nov 2012 | Includes GTAS processes necessary to create the GTAS interface file, migration of FACTS balances to GTAS, the GTAS Trial Balance, and the GTAS Transaction Register.
    GTAS Reports | Nov/Dec 2012 | GTAS Trial Balance, GTAS Transaction Register
    Capture of Trading Partner TAS/BETC | Apr/May 2013 | Includes modifications necessary to capture BETC and Trading Partner TAS/BETC on relevant transactions.
    GTAS Other Processes | May/Jun 2013 | Includes GTAS Customer and Vendor update processes.
    IPAC | Aug/Sep | Includes modifications required to IPAC to accommodate componentized TAS and BETC.

    11i
    Deliverable | Target Timeframe for Customer Patches | Comments
    GTAS Configuration | May 2012 | Patch is available
    GTAS Key Processes | Nov/Dec 2012 | Includes GTAS processes necessary to create the GTAS interface file, migration of FACTS balances to GTAS, the GTAS Trial Balance, and the GTAS Transaction Register.
    GTAS Reports | Dec/Jan 2012 | GTAS Trial Balance, GTAS Transaction Register
    Capture of Trading Partner TAS/BETC | May/Jun 2013 | Includes modifications necessary to capture BETC and Trading Partner TAS/BETC on relevant transactions.
    GTAS Other Processes | Jun/Jul 2013 | Includes GTAS Customer and Vendor update processes.
    IPAC | Sep/Oct 2013 | Includes modifications required to IPAC to accommodate componentized TAS and BETC.


  • Certifications in the new Certify - March 2011 Update

    - by richard.miller
    The most up-to-date certifications are now available in Certify - New Additions March 2011! What's not yet available can still be found in Classic Certify. We think that the new search will save you a ton of time and energy, so try it out and let us know. NOTE: Not all cert information is in the new system. If you type in a product name and do not find it, send us feedback so we can find the team to add it! Also, we have been listening to every feedback message coming in. We have plans to make some improvements based on your feedback AND add the missing data. Thanks for your help! Japanese: 日本語. Note: Oracle Fusion Middleware certifications are available via oracle.com: Fusion Middleware Certifications.

    Certifications viewable in the new Certify Search:
    Enterprise PeopleTools Release 8.50 and Release 8.51 (added March 2011!)
    Oracle Database
    Oracle Database Options
    Oracle Database Clients (they apply to both 32-bit and 64-bit)
    Oracle Beehive
    Oracle Collaboration Suite
    Oracle E-Business Suite, now with Release 11i & 12!
    Oracle Siebel Customer Relationship Management (CRM)
    Oracle Governance, Risk, and Compliance Management
    Oracle Financial Services
    Oracle Healthcare
    Oracle Life Sciences
    Oracle Enterprise Taxation Management
    Oracle Retail
    Oracle Utilities
    Oracle Cross Applications
    Oracle Primavera
    Oracle Agile
    Oracle Transportation Management (G-L)
    Oracle Value Chain Planning
    Oracle JD Edwards EnterpriseOne (NEW! Jan 2011) 8.9+ and SP23+
    Oracle JD Edwards World (A7.3, A8.1, A9.1, and A9.2)

    Certifications viewable in Classic Certify:
    Classic Certify is the "old" user interface. Clicking the "Classic Certify" link from Certifications > QuickLinks will take you there.
    Enterprise PeopleTools Release 8.49 (Coming Soon)
    Enterprise Manager

    Other Resources: See the Tips and Tricks for the new Certify. Watch the 4-minute introduction to the new Certify, or learn how to get the most out of Certify with an advanced searching and features demo.


  • mdadm: breaks boot due to "is not ready yet or not present" error

    - by BarsMonster
    This is so damn frustrating :-| I've spent something like 20 hours on this nice error, and it seems dozens of people across the Internet have too, with no clear solution yet. I have a non-system RAID-5 of 5 disks, and it's fine. But during boot-up the system says "/dev/md0 is not ready yet or not present" and asks me to press 'S'. Very nice for Ubuntu Server - I have to bring over a monitor and keyboard just to get past it. After that the system boots and everything is fine: the md0 device works, /proc/mdstat is fine, and when I do mount -a it mounts the array without errors and works fine. As a dumb and shameful workaround I added noauto in /etc/fstab and did the mounting in /etc/rc.local - that works. Any hints on how to make it work properly?

    fstab:

        UUID=3588dfed-47ae-4c32-9855-2d69df713b86 /var/bigfatdisk ext4 noauto,noatime,data=writeback,barrier=0,nobh,commit=5 0 0

    mdadm config (autogenerated):

        # mdadm.conf
        #
        # Please refer to mdadm.conf(5) for information about this file.
        #
        # by default, scan all partitions (/proc/partitions) for MD superblocks.
        # alternatively, specify devices to scan, using wildcards if desired.
        DEVICE partitions

        # auto-create devices with Debian standard permissions
        CREATE owner=root group=disk mode=0660 auto=yes

        # automatically tag new arrays as belonging to the local system
        HOMEHOST <system>

        # instruct the monitoring daemon where to send mail alerts
        MAILADDR CENSORED

        # definitions of existing MD arrays
        ARRAY /dev/md/0 metadata=1.2 bitmap=/var/md0_intent UUID=efccbeb6:a0a65cd6:470dcdf3:62781188 name=LBox2:0

        # This file was auto-generated on Mon, 10 Jan 2011 04:06:55 +0200
        # by mkconf 3.1.2-2
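    Two hedged avenues that often clear this kind of boot-time race on Ubuntu (assumptions, not verified against this exact box):

        # 1) regenerate mdadm.conf the Debian way and bake it into the
        #    initramfs, so md0 is assembled before mountall evaluates fstab
        sudo /usr/share/mdadm/mkconf | sudo tee /etc/mdadm/mdadm.conf
        sudo update-initramfs -u

        # 2) or keep the automatic mount but stop it from blocking boot
        #    (Ubuntu's mountall honors the nobootwait option)
        UUID=3588dfed-47ae-4c32-9855-2d69df713b86 /var/bigfatdisk ext4 noatime,nobootwait 0 0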


  • Cannot mount a CIFS network share over VPN

    - by Aron Rotteveel
    I have set up a VPN connection to our Windows 2008 server at the office and it seems to work fine. For some reason, however, I am still not able to access the network shares over the VPN connection using my standard fstab entries. When I am physically connected to the network, it works fine, but when trying this over VPN I get the following error:

        mount error(110): Connection timed out
        Refer to the mount.cifs(8) manual page (e.g. man mount.cifs)

    My /etc/fstab looks like this:

        //server2008/share /mnt/share cifs iocharset=utf8,credentials=/home/aron/.smbcredentials,uid=1000 0 0

    As said, it works fine when physically connected, but over VPN it just won't work. Any help is appreciated. EDIT: It seems the Windows firewall is making things harder on me. When I turn it off, I get a bit further, although I still get the following error message: "Unable to find suitable address." The strange thing is that I have file sharing added as an exception to the firewall. Ports 137-139 and port 445 are open, which should suffice, shouldn't it? EDIT Jan 20th: Still not working. When I have the firewall turned on, it times out. When I turn it off, I get the "not suitable address" error. Turning the firewall off is not an option, by the way.
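    Since "Unable to find suitable address" is typically mount.cifs failing to resolve the NetBIOS name across the tunnel, one hedged workaround is to mount by the server's VPN IP, or to pin the name locally; the address below is made up for illustration:

        # /etc/fstab - hypothetical VPN address in place of the NetBIOS name
        //192.168.1.10/share /mnt/share cifs iocharset=utf8,credentials=/home/aron/.smbcredentials,uid=1000 0 0

        # or keep the name and resolve it locally
        echo "192.168.1.10 server2008" | sudo tee -a /etc/hosts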


  • /tmp shows 690 MB full, actual size 72K. Why?

    - by Ankit
    Why is the /tmp directory on my system showing 690 MB used, whereas du -sh /tmp shows only 72K?

        drwxrwxrwt 2 lightdm lightdm  4096 Aug 29 21:49 at-spi2
        drwx------ 2 ankit   ankit    4096 Aug 29 21:50 keyring-0JTfoY
        drwx------ 2 ankit   ankit    4096 Aug 29 21:44 keyring-rChLLL
        drwx------ 2 root    root    16384 Jul 22 02:10 lost+found
        drwx------ 2 ankit   ankit    4096 Jan  1  1970 orbit-ankit
        drwx------ 2 lightdm lightdm  4096 Aug 29 21:50 pulse-2L9K88eMlGn7
        drwx------ 2 root    root     4096 Aug 29 21:44 pulse-PKdhtXMmr18n
        drwx------ 2 ankit   ankit    4096 Aug 29 21:50 pulse-zR1TZUAZfmQW
        drwx------ 2 ankit   ankit    4096 Aug 29 21:44 ssh-dlslOXOq2203
        drwx------ 2 ankit   ankit    4096 Aug 29 21:50 ssh-MrQQVRyy3316
        -rw------- 1 ankit   ankit       0 Aug 29 21:45 tmp0qnNG4
        -rw------- 1 ankit   ankit       0 Aug 29 21:50 tmpVvSMt6
        -rw------- 1 ankit   ankit       0 Aug 29 21:49 tmpy9Gadz
        -rw-rw-r-- 1 lightdm lightdm     0 Aug 29 21:44 unity_support_test.0

        ankit@duster:/tmp$ df -h
        df: `/home/ankit/.gvfs': Transport endpoint is not connected
        Filesystem      Size  Used Avail Use% Mounted on
        /dev/sda1        79G   11G   65G  14% /
        udev            2.9G  4.0K  2.9G   1% /dev
        tmpfs           1.2G  868K  1.2G   1% /run
        none            5.0M     0  5.0M   0% /run/lock
        none            2.9G  220K  2.9G   1% /run/shm
        /dev/sda7        38G  690M   35G   2% /tmp
        /dev/sda5        93G   26G   63G  30% /home
        /dev/sda6        93G  1.6G   87G   2% /boot
        /dev/sda3       154G   69G   78G  48% /home/mount_150

        ankit@duster:/tmp$ sudo du -sh /tmp/
        72K
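    Two hedged places the "missing" space usually hides on ext4 (neither is confirmed from the listing alone): files that were deleted while still held open by a process, which df counts but du cannot see, and filesystem overhead such as the journal:

        # files already unlinked but still held open by some process
        sudo lsof +L1 | grep /tmp

        # journal and reserved-block figures for the /tmp filesystem
        sudo tune2fs -l /dev/sda7 | grep -iE 'journal|reserved'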


  • JavaScript Class Patterns Revisited: Endgame

    - by Liam McLennan
    I recently described some of the patterns used to simulate classes (types) in JavaScript. But I missed the best pattern of them all. I described a pattern I called constructor function with a prototype that looks like this:

        function Person(name, age) {
          this.name = name;
          this.age = age;
        }
        Person.prototype = {
          toString: function() {
            return this.name + " is " + this.age + " years old.";
          }
        };

        var john = new Person("John Galt", 50);
        console.log(john.toString());

    and I mentioned that the problem with this pattern is that it does not provide any encapsulation, that is, it does not allow private variables. Jan Van Ryswyck recently posted the solution, obvious in hindsight, of wrapping the constructor function in another function, thereby allowing private variables through closure. The above example becomes:

        var Person = (function() {
          // private variables go here
          var name, age;

          function constructor(n, a) {
            name = n;
            age = a;
          }

          constructor.prototype = {
            toString: function() {
              return name + " is " + age + " years old.";
            }
          };

          return constructor;
        })();

        var john = new Person("John Galt", 50);
        console.log(john.toString());

    Now we have prototypal inheritance and encapsulation. The important thing to understand is that the constructor and the toString function both have access to the name and age private variables because they are in an outer scope and they become part of the closure.


  • 10.10 boots to command line login prompt

    - by greggory.hz
    I recently installed Ubuntu 10.10 on a computer that was previously running 10.04 (which worked fine). Now, each time I boot up, it starts at a command-line login prompt. I can log in and it stays at the command line (as expected). I can then manually start gdm with sudo start gdm and it works fine. I can also enable Compiz (using the proprietary NVIDIA drivers), so I'm reasonably confident that it's not a driver problem (at least not in the sense that the drivers just flat out aren't working). Interestingly, if I leave it at the command prompt without logging in, after about 5 or 10 minutes gnome starts up on its own. I'm not sure what is causing this. This is what dmesg | tail gives me after a manual start of gdm:

        [  15.664166] NVRM: loading NVIDIA UNIX x86_64 Kernel Module 270.18 Tue Jan 18 21:46:26 PST 2011
        [  15.991304] type=1400 audit(1297543976.953:11): apparmor="STATUS" operation="profile_load" name="/usr/share/gdm/guest-session/Xsession" pid=990 comm="apparmor_parser"
        [  16.606986] eth0: link up, 100Mbps, full-duplex, lpa 0xCDE1
        [  18.798506] EXT4-fs (sda1): re-mounted. Opts: errors=remount-ro,commit=0
        [  26.740010] eth0: no IPv6 routers present
        [  90.444593] EXT4-fs (sda1): re-mounted. Opts: errors=remount-ro,commit=0
        [ 189.252208] audit_printk_skb: 21 callbacks suppressed
        [ 189.252213] type=1400 audit(1297544150.218:19): apparmor="STATUS" operation="profile_replace" name="/usr/lib/cups/backend/cups-pdf" pid=1876 comm="apparmor_parser"
        [ 189.252584] type=1400 audit(1297544150.218:20): apparmor="STATUS" operation="profile_replace" name="/usr/sbin/cupsd" pid=1876 comm="apparmor_parser"
        [ 351.159585] lo: Disabled Privacy Extensions
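    A couple of hedged first checks (generic Ubuntu 10.10 steps, not taken from the post): confirm that gdm is still registered as the default display manager and that its upstart job survived the upgrade:

        cat /etc/X11/default-display-manager   # should print /usr/sbin/gdm
        sudo dpkg-reconfigure gdm              # re-select gdm as the default DM
        ls /etc/init/gdm.conf                  # the upstart job gdm starts from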


  • Slides, Materials, and Pictures from SharePoint Saturday Virginia Beach 2011

    - by Brian Jackett
    This past weekend I presented "Managing SharePoint 2010 Farms with PowerShell" and "SharePoint 2010 and Integrating Line of Business Applications" at SharePoint Saturday Virginia Beach. A big thanks to everyone who attended my sessions. I had a great time presenting, getting to meet new folks, and exploring a little bit of the local life. Below are slides, materials, and pictures from the event. Let me know if you have any comments, questions, or feedback. Thanks.

    Slides and Materials: Managing SharePoint 2010 Farms with PowerShell; SharePoint 2010 and Integrating Line of Business Applications

    Photos: Pictures on Facebook; Pictures on Windows Live (higher res): SharePoint Saturday Virginia Beach Jan 2011

    Side Note: SavePSToSP CodePlex Project. During my "Managing SharePoint 2010 Farms with PowerShell" session I made mention of a CodePlex project I am working on called SavePSToSP. Click here for the link to that project. I have been pushing out updates roughly once a month or more. If you have any feedback or find it helpful, feel free to let me know.

    -Frog Out


  • recent unreliable wireless connection on 10.04 and 10.10

    - by gabkdlly
    Recently, my internet connection over wireless has become unreliable on both a Dell laptop running Ubuntu 10.04 and my desktop running Ubuntu 10.10. The problem does not seem to occur on a laptop running Windows Vista. The problem does not seem to occur on my Openmoko Freerunner (running Android 1.5) either, though I hardly ever use that device to connect over WLAN, so the problem may have just slipped by. The problem also does not appear when I boot into Ubuntu 9.10 from a live CD (more precisely, I was able to ping fu-berlin.de for an hour without any packet loss). Under Ubuntu 10.10, I am experiencing about 33% packet loss. On my main Ubuntu desktop, I have tried the following wireless devices: a Longshine PCI card (an old device with an RTL8180L chip), a D-Link DWL-510 PCI card (this device threw warnings in dmesg), and a USB device from MSI (US54EX). Usually my wireless network shows up in the network manager with a normal signal strength, even when the connection speed is slow (which happens often) or the connection gets reset (asking me to click connect to re-authenticate my wireless connection). I have observed this problem with a Netgear KWGR614 router (with the manufacturer's firmware) as well as with a TP-LINK TL-WR741ND router running OpenWrt. Taking a look at my router's logs, I find many instances of the following line:

        Tuesday,04 Jan 2011 03:53:01 [TCP SYN Flood][Deny access policy matched, dropping packet]

    I know that the Netgear router is susceptible to denial-of-service attacks, as I have previously been able to disrupt its operation by putting an nmap scan into a while loop. I use WEP on the Netgear router and WPA on the TP-LINK to encrypt the wireless connections. Is it possible that someone is jamming my signal?
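    For anyone wanting to reproduce the loss measurement, a hedged sketch (the target host is simply the one used above; any stable host works):

        # 100 probes; the summary at the end reports the loss percentage
        ping -c 100 fu-berlin.de | tail -n 2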

