Search Results

Search found 74197 results on 2968 pages for 'part time'.

  • Modifying Contiguous Time Periods in a History Table

    Alex Kuznetsov is credited with a clever technique for building a SQL history table that stores contiguous time periods and, using nothing but constraints, guarantees that those periods really are contiguous. The technique has become increasingly useful with the DATE data type in SQL Server. Because modifying data in this type of table isn't entirely intuitive, Alex is on hand to give a brief explanation of how to do it.
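    The core of the trick is that each row stores the end of the period that precedes it, and a foreign key plus CHECK constraints force that predecessor both to exist and to end exactly where the new period starts. A minimal T-SQL sketch in the spirit of the technique (table and column names here are illustrative, not Alex's exact schema):

        -- One status history per object; the periods must form an unbroken chain.
        CREATE TABLE dbo.Statuses (
            ObjectID           int  NOT NULL,
            StartedAt          date NOT NULL,
            FinishedAt         date NULL,  -- NULL marks the current, open-ended period
            PreviousFinishedAt date NULL,  -- NULL marks the object's first period
            CONSTRAINT PK_Statuses PRIMARY KEY (ObjectID, StartedAt),
            CONSTRAINT UNQ_Statuses UNIQUE (ObjectID, FinishedAt),
            CONSTRAINT CHK_Statuses_ValidPeriod CHECK (FinishedAt > StartedAt),
            -- The preceding period must actually exist...
            CONSTRAINT FK_Statuses_Previous FOREIGN KEY (ObjectID, PreviousFinishedAt)
                REFERENCES dbo.Statuses (ObjectID, FinishedAt),
            -- ...and must finish exactly when this one starts, so no gaps can sneak in.
            CONSTRAINT CHK_Statuses_Contiguous CHECK (PreviousFinishedAt = StartedAt)
        );

    This is also why modifications are non-intuitive: shrinking, growing or deleting a period means adjusting the neighbouring row's PreviousFinishedAt as part of the same operation, which is exactly the kind of manoeuvre the article walks through.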

    Read the article

  • A Real-Time HPC Approach for Optimizing Multicore Architectures

    Complex math is at the heart of many of the biggest technical challenges. With multicore processors, the type of calculations that would have required a supercomputer can now be performed in real-time, embedded environments.

    Read the article

  • User Productivity Kit - Powerful Packages (Part 1)

    - by [email protected]
    User Productivity Kit provides the ability to create a variety of content types, including robust topics on system processes and web pages with formatted text and graphics. There are times when you want to enhance content with media types not natively created by User Productivity Kit: video, custom animations, forms, and more. One method of doing this is to maintain these media files on a web server, separate from the User Productivity Kit player content, and link to the files using absolute URLs such as http://myserver/overview.html. While this will get you going, you won't benefit from the content management capabilities of the UPK Developer. Features such as check-in / check-out, history, document properties, and folder permissions are not available to this external content. Further, if you ever need to move that content to a server with a different name or domain, you'd need to update all your links.

    UPK version 3.1 introduced a new document type: the package. A package is a group of folders and files that you manage in the Developer library as a single document. Package documents work in the same manner as any other document in the library, and you can use all of the collaborative content development features you see with other document types. Packages can be used for anything from single Word documents, PDF files, and graphics to more intricate sets of inter-related files, as commonly seen with HTML files and their graphics, style sheets, and JavaScript files. The structure of the files and folders within a package is always preserved, which means that any relative links between files in the package will work. For example, an HTML file containing an image tag with a relative link to a graphic elsewhere in the same package will continue to function properly, both when viewed in the Developer and when published to outputs such as the UPK Player.

    Once you start to use packages, you'll soon discover that there is a lot of existing content that can be re-purposed by placing it into UPK packages. Packages are easily created by selecting File...New...Package. Files can be added in a number of ways, including the "Add Files" button, copy & paste from Windows Explorer, and drag & drop. To use one of the files in a package, just create a link to the file you want to target. This is supported throughout the Developer in places such as section & topic concepts, frame links, and hyperlinks in web pages.

    A little more challenging is determining how to structure packages in your library. As I mentioned earlier, a package can contain anything from a single file to dozens of files and folders. So what should you do? You could create a package for each file. You could create one package for all your files. But which one is right? Well, there's no right or wrong answer to this question; there are advantages and disadvantages to each. The right decision will be influenced by the package files themselves, the structure of the content in the library, the size and working style of the development team, how content is shared between different outlines, and more.

    The first consideration can be assessed the quickest: if the content to be placed in the package is composed of multiple files and those files reference each other, they should be in the same package. There are loads of examples of this type of content. HTML files with graphics and style sheets, HTML files with embedded Flash movies, and Word documents saved as HTML are all examples where the content is composed of multiple files that reference each other in some way. Content like this should always be placed in a single package so that the relative links between the files are preserved and play properly in the UPK Player. In upcoming posts, I'll explain additional considerations.

    Read the article

  • Building a Store Locator ASP.NET Application Using Google Maps API (Part 1)

    Over the past couple of months I've been working on a couple of projects that have used the free Google Maps API (http://code.google.com/apis/maps/) to add interactive maps and geocoding (http://en.wikipedia.org/wiki/Geocoding) capabilities to ASP.NET websites. In a nutshell, the Google Maps API allows you to display maps on your website, to add markers onto the map, and to compute the latitude and longitude of an address, among many other tasks. With some Google Maps API experience under my belt, I decided it would be fun to implement a store locator feature and share it here on 4Guys. A store locator lets a visitor enter an address or postal code and then shows the nearby stores. Typically, store locators display the

    Read the article

  • Silverlight: IronPython in the browser Part 2

    This article is an excerpt taken from IronPython in Action, by Michael J. Foord and Christian Muirhead and published by Manning Publications. In the article the authors discuss how to take advantage of the exciting Silverlight platform with IronPython.

    Read the article

  • Website Optimisation - The Impact of Blended Real-time Search So Far

    Despite the initial hype surrounding the introduction of blended real-time search into internet search engines, many experts have begun to question its value to website optimisation. Real-time search has been widely criticised as a cause of SERP clutter, making pages appear chaotic and leaving the user struggling to decide which links may actually provide the information they are after.

    Read the article

  • Rawr Code Clone Analysis – Part 0

    - by Dylan Smith
    Code Clone Analysis is a cool new feature in Visual Studio 11 (vNext). It analyzes all the code in your solution and attempts to identify blocks of code that are similar, and thus candidates for refactoring to eliminate the duplication. The power lies in the fact that the blocks of code don't need to be identical for Code Clone to identify them; it will report Exact, Strong, Medium and Weak matches indicating how similar the blocks of code in question are.

    People that know me know that I'm enthusiastic about both writing clean code and taking old crappy code and making it suck less. So the possibilities for this feature have me pretty excited, if it works well - and that's a big if that I'm hoping to explore over the next few blog posts.

    I'm going to grab the Rawr source code from CodePlex (a World of Warcraft gear calculator program), run Code Clone Analysis against it, then go through the results one by one and refactor where appropriate, blogging along the way. My goals with this blog series are twofold:

    - Evaluate and demonstrate Code Clone Analysis
    - Provide some concrete examples of refactoring code to eliminate duplication and improve the code-base

    Here are the initial results. Code Clone Analysis has found:

    - 129 Exact Matches
    - 201 Strong Matches
    - 300 Medium Matches
    - 193 Weak Matches

    Also indicated is a total of 45,181 potentially duplicated lines of code that could be eliminated through refactoring. Considering the entire solution only has 109,763 lines of code, if true, that is roughly 41% of the code-base, which is pretty significant. In the next post we'll start examining some of the individual results and determine if they really do indicate a potential refactoring.

    Read the article

  • SAF Evaluation Part II - the Formal Methods

    Earlier I talked about evaluating a candidate architecture in code. This post is dedicated to evaluation on paper. I remember one system I was working on where I was keen on making the architecture asynchronous and message oriented (it was all circa 2001, by the way). However, I was new on the team and my role (as the [...]

    Read the article

  • T-SQL User-Defined Functions: the good, the bad, and the ugly (part 4)

    - by Hugo Kornelis
    Scalar user-defined functions are bad for performance. I already showed that for T-SQL scalar user-defined functions without and with data access, and for most CLR scalar user-defined functions without data access, and in this blog post I will show that CLR scalar user-defined functions with data access fit into that picture. First attempt: sticking to my simplistic example of finding the triple of an integer value by reading it from a pre-populated lookup table and following the standard recommendations...(read more)
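    For context, here is a T-SQL sketch of the kind of lookup-table example the series keeps returning to (my reconstruction, not Hugo's exact code): a pre-populated table mapping each integer to its triple, and a scalar user-defined function that reads from it.

        -- Hypothetical pre-populated lookup table: Triple = 3 * Value.
        CREATE TABLE dbo.Triples (
            Value  int NOT NULL PRIMARY KEY,
            Triple int NOT NULL
        );
        GO

        -- Scalar UDF with data access: the SELECT below runs once per calling
        -- row, which is the root of the performance problems the series measures.
        CREATE FUNCTION dbo.Triple(@Value int)
        RETURNS int
        AS
        BEGIN
            DECLARE @Result int;
            SELECT @Result = Triple
            FROM   dbo.Triples
            WHERE  Value = @Value;
            RETURN @Result;
        END;

    Called as SELECT dbo.Triple(SomeColumn) FROM SomeTable, the lookup executes row by row instead of being folded into a single join; the CLR-with-data-access variant examined in this part follows the same pattern.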

    Read the article

  • WIF, ADFS 2 and WCF – Part 5: Service Client (more Flexibility with WSTrustChannelFactory)

    - by Your DisplayName here!
    See the previous posts first. WIF includes an API to manually request tokens from a token service. This gives you more control over the request, and more flexibility, since you can use your own token caching scheme instead of being bound to the channel object lifetime.

    The API is straightforward. You first request a token from the STS and then use that token to create a channel to the relying party service. I'd recommend using the WS-Trust bindings that ship with WIF to talk to ADFS 2 – they are pre-configured to match the binding configuration of the ADFS 2 endpoints. The following code requests a token for a WCF service from ADFS 2:

        private static SecurityToken GetToken()
        {
            // Windows authentication over transport security
            var factory = new WSTrustChannelFactory(
                new WindowsWSTrustBinding(SecurityMode.Transport),
                stsEndpoint);
            factory.TrustVersion = TrustVersion.WSTrust13;

            var rst = new RequestSecurityToken
            {
                RequestType = RequestTypes.Issue,
                AppliesTo = new EndpointAddress(svcEndpoint),
                KeyType = KeyTypes.Symmetric
            };

            var channel = factory.CreateChannel();
            return channel.Issue(rst);
        }

    Afterwards, the returned token can be used to create a channel to the service. Again, WIF has some helper methods here that make this very easy:

        private static void CallService(SecurityToken token)
        {
            // create binding and turn off sessions
            var binding = new WS2007FederationHttpBinding(
                WSFederationHttpSecurityMode.TransportWithMessageCredential);
            binding.Security.Message.EstablishSecurityContext = false;

            // create factory and enable WIF plumbing
            var factory = new ChannelFactory<IService>(binding, new EndpointAddress(svcEndpoint));
            factory.ConfigureChannelFactory<IService>();

            // turn off CardSpace - we already have the token
            factory.Credentials.SupportInteractive = false;

            var channel = factory.CreateChannelWithIssuedToken<IService>(token);

            channel.GetClaims().ForEach(c =>
                Console.WriteLine("{0}\n {1}\n  {2} ({3})\n",
                    c.ClaimType,
                    c.Value,
                    c.Issuer,
                    c.OriginalIssuer));
        }

    Why is this approach more flexible? Well – some don't like the configuration voodoo; that's a valid reason for using the manual approach. You also get more control over the token request itself, since you have full control over the RST message that gets sent to the STS. One common parameter that you may want to set yourself is the appliesTo value. When you use the automatic token support in the WCF federation binding, the appliesTo is always the physical service address, which in turn means that this address will be used as the audience URI value in the SAML token. This means that when you have an application that consists of multiple services, you always have to configure all physical endpoint URLs in ADFS 2 and in the WIF configuration of the service(s). Having control over the appliesTo allows you to use more symbolic realm names, e.g. the base address or a completely logical name. Since the URL is never de-referenced, you have some degree of freedom here.

    In the next post we will look at the code needed to request multiple tokens in a call chain. This is a common scenario when you first have to acquire a token from an identity provider and then send it on to a federation gateway or Resource STS. Stay tuned.

    Read the article

  • Portal and Content - Components, part 3 – Applied Customization Framework (4 of 7)

    - by Stefan Krantz
    Have you ever been challenged with the situation where your work task asks you to implement functionality in the WebCenter Portal, you browse through the Resource Catalog (Business Dictionary) and find the functionality you need, but when you get started there are small shortcomings and you ask yourself:

    - How can I re-use what is out of the box?
    - I wonder what code I need to use to produce similar functions and include my new requirements?
    - Must I write a new taskflow?

    The above questions are in many cases answered simply: you can do a taskflow customization to out-of-the-box taskflows. In this post I will help you understand how to do such a customization. It is best described as a 4-step process (the original post illustrates it with a flow image).

    Just to clarify a few naming confusions that might occur when going through the above process:

    - Customization Role is a function within JDeveloper that allows you to implement view and flow customizations to existing taskflows.
    - WebCenter Portal - Spaces Taskflow Customization Framework: this technology scope does not refer only to WebCenter Spaces; it also includes WebCenter Portal/Framework.
    - A taskflow customization does not overwrite or replace any code; it just creates an additional tip view of the taskflow in the MDS for the current application (WebCenter Portal or WebCenter Spaces).

    To sum up this simple procedure, I would also like to help you find your way around the main topic for this post series, which focuses primarily on Content integration with WebCenter Portal. So where can you find content-related taskflows in the WebCenter libraries? The list below mentions some useful locations for taskflows and each taskflow's page fragments.

    Library Reference - WebCenter Document Library Service View

    Content Presenter
    Path: oracle.webcenter.doclib.view.jsf.taskflows.presenter
    - Taskflow: contentPresenter.xml - the Content Presenter taskflow
    - Taskflow: contentPresenterWizard.xml - the publishing wizard to select content, select a template and preview, including contribution

    Document Manager
    Path: oracle.webcenter.doclib.view.jsf.taskflows.docManager
    - Taskflow: documentManager.xml - the Document Manager taskflow, which includes references to document management features including browsing, download, uploading and viewing

    For more information on taskflow customizations please see the following documentation: http://docs.oracle.com/cd/E23943_01/webcenter.1111/e10148/jpsdg_taskflows.htm#BACIEGJD

    Read the article

  • OBIA on Teradata - Part 2 Teradata DB Utilization for ETL

    - by Mohan Ramanuja
    Techniques to Monitor Queries and ETL Load

    CPU and disk I/O:

        Select username, processor, sum(cputime), sum(diskio)
        from dbc.ampusage
        where processor = '1-0'
        group by 1,2
        order by 2,3 desc;

        UserName   Vproc   Sum(CpuTime)   Sum(DiskIO)
        AC00916    10      6.71           24975

    List hardware errors. There is a possibility that the system might have adequate disk space but be out of free cylinders. In order to monitor hardware errors, the following query was used:

        Select * from dbc.Software_Event_Log
        where Text like '%restart%'
        order by thedate, thetime;

    For active users, usage of CPU and analysis of bad CPU-to-I/O ratios:

        Select * from DBC.AMPUSAGE
        where username = 'CRMSTGC_DEV_ID'
        and SUBSTR(ACCOUNTNAME,6,3) = '006';

    Usage by I/O:

        Select AccountName, UserName, sum(CpuTime), sum(DiskIO)
        from DBC.AMPUSAGE
        group by AccountName, UserName
        order by Sum(DiskIO) desc;

        AccountName     UserName          Sum(CpuTime)   Sum(DiskIO)
        $M1$10062209    AB89487           374628.612     7821847
        $M1$10062210    AB89487           186692.244     2799412
        $M1$10062213    COC_ETL_ID        119531.068     331100426
        $M1$10062200    AB63472           118973.316     109881984
        $M1$10062204    AB63472           110825.356     94666986
        $M1$10062201    AB63472           110797.976     75016994
        $M1$10062202    AC06936           100924.448     407839702
        $M1$10062204    AB67963           0              4
        $M1$10062207    AB91990           0              2
        $M1$10062208    AB63461           0              24
        $M1$10062211    AB84332           0              6
        $M1$10062214    AB65484           0              8
        $M1$10062205    AB77529           0              58
        $M1$10062210    AC04768           0              36
        $M1$10062206    AB54940           0              22

    Usage by CPU:

        Select AccountName, UserName, sum(CpuTime), sum(DiskIO)
        from DBC.AMPUSAGE
        group by AccountName, UserName
        order by Sum(CpuTime) desc;

        AccountName                      UserName          Sum(CpuTime)   Sum(DiskIO)
        $M1$10062209                     AB89487           374628.612     7821847
        $M1$10062210                     AB89487           186692.244     2799412
        $M1$10062213                     COC_ETL_ID        119531.068     331100426
        $M1$10062200                     AB63472           118973.316     109881984
        $M1$10062204                     AB63472           110825.356     94666986
        $M1$10062201                     AB63472           110797.976     75016994
        $M2$100622105813004760047LOAD    T23_ETLPROC_ENT   0              6
        $M1$10062215                     AA37720           0              180
        $M1$10062209                     AB81670           0              6

    Count of virtual processors:

        Select count(distinct vproc) from dbc.ampusage;

        432

    Per-AMP usage for the staging user:

        select * from dbc.ampusage where username = 'CRM_STGC_DEV_ID';

        AccountName     UserName          CpuTime   DiskIO   CpuTimeNorm        Vproc   VprocType   Model
        $M1$10062205    CRM_STGC_DEV_ID   0.32      1764     12.7423999023438   0       AMP         2580
        $M1$10062205    CRM_STGC_DEV_ID   0.28      1730     11.1495999145508   3       AMP         2580
        $M1$10062205    CRM_STGC_DEV_ID   0.304     1736     12.1052799072266   4       AMP         2580
        $M1$10062205    CRM_STGC_DEV_ID   0.248     1731     9.87535992431641   7       AMP         2580
        $M1$10062205    CRM_STGC_DEV_ID   0.332     1731     13.2202398986816   8       AMP         2580
        $M1$10062205    CRM_STGC_DEV_ID   0.284     1712     11.3088799133301   11      AMP         2580
        $M1$10062205    CRM_STGC_DEV_ID   0.24      1757     9.55679992675781   12      AMP         2580
        $M1$10062205    CRM_STGC_DEV_ID   0.292     1737     11.6274399108887   15      AMP         2580
        $M1$10062205    CRM_STGC_DEV_ID   0.268     1753     10.6717599182129   16      AMP         2580
        $M1$10062205    CRM_STGC_DEV_ID   0.276     1732     10.9903199157715   19      AMP         2580

    Database version information:

        select * from dbc.dbcinfo;

        InfoKey                 InfoData
        LANGUAGE SUPPORT MODE   Standard
        RELEASE                 12.00.03.03
        VERSION                 12.00.03.01a

    Read the article

  • Gauging Maturity of your BPM Strategy - part 2 / 2

    - by Sanjeev Sharma
    In my earlier post I discussed the essence of maturity assessment and the business imperative for doing the same in the context of BPM. In this post I will discuss Oracle's BPM Maturity assessment methodology.

    Oracle's BPM Maturity model comprises the following components:

    - Maturity - represents stages of evolution of your BPM capability, with 0 being the lowest level and 5 being the highest
    - Domain - represents multiple perspectives, both technical and business oriented, against which your BPM capability can be assessed
    - Adoption - represents the scale of BPM rollout, starting at the project level and going up to the enterprise level

    Note: Your BPM capability can be at different levels of maturity for the different domains. Oracle's BPM assessment methodology measures the maturity of your BPM capability at the individual domain level as well as at the aggregate level.

    The output of Oracle's BPM assessment benefits you in two ways:

    - Gap analysis, by comparing the "As-Is" BPM capability with the desired "To-Be" BPM capability along the various domains (see Figure 1)
    - Systematic adoption, by aligning the evolution of BPM capability with its rollout in multiple phases (see Figure 2)

    Read the article

  • A Deep Dive into Transport Queues (Part 2)

    Johan Veldhuis completes his 'Deep Dive' by plunging even deeper into the mysteries of MS Exchange's Transport queues, which temporarily store messages until they can be passed through to the next stage, and explains how to change the way they work via configuration settings.

    Read the article

  • Managing Social Relationships for the Enterprise – Part 1

    - by Michael Hylton
    By Reggie Bradford, Senior Vice President, Oracle

    Today, Mark Hurd, President of Oracle, Thomas Kurian, Executive Vice President of Oracle, and I discussed the strategic importance of how social media is impacting the enterprise and how it is changing the way customers, prospects, employees and investors interact with brands worldwide. Oracle understands that the consumer is in control, and as such, brands must evolve and change to meet growing needs. In addition, according to social media thought leader and Altimeter Group analyst Jeremiah Owyang, companies now average 178 corporate-owned social media accounts.

    When Oracle added leading social marketing, listening analytics and development tools from Vitrue, Collective Intellect and Involver to its Cloud Services Suite, we went beyond providing a single set of tools. We developed an entire framework that includes a comprehensive social relationship management suite to help companies move beyond the social enterprise and achieve the social-enabled enterprise. The fundamental shift from transaction to engagement means that enterprises need not only a social strategy, but should also ensure that the information and data received from social initiatives flow back to marketing, sales, support and service. Doing so enables companies to deliver a proactive and compelling experience, and provides analytics to turn engagement into opportunity - and ultimately that opportunity into revenue.

    On September 13, 2012, I am delighted to sit down with Jeremiah to further the discussion about how enterprises are addressing social media strategies and managing content. In addition, we will be taking your questions after the webinar via Twitter (@Oracle, @ReggieBradford, @cfinn, @jowyang). Use #oracle and #socbiz to submit questions and follow the conversation. I look forward to speaking with you and answering your questions online.

    For more information about becoming a social-enabled enterprise, visit www.oracle.com/social. And don't miss the insights of other social business thought leaders at www.oracle.com/goto/socialbusiness.

    Read the article

  • Orchestrating the Virtual Enterprise, Part I

    - by Kathryn Perry
    A guest post by Jon Chorley, Oracle's Chief Sustainability Officer & Vice President, SCM Product Strategy

    During the American Industrial Revolution, the Ford Motor Company did it all. It turned raw materials into a showroom full of Model Ts. It owned a steel mill, a glass factory, and an automobile assembly line. The company was both self-sufficient and innovative and went on to become one of the largest and most profitable companies in the world.

    Nowadays, it's unusual for any business to follow this vertical integration model, because it's much harder to be best in class across such a wide range of capabilities and services. Instead, businesses focus on their core competencies and outsource other business functions to specialized suppliers. They exchange vertical integration for collaboration. When done well, all parties benefit from this arrangement, and the collaboration leads to the creation of an agile, lean and successful "virtual enterprise."

    Case in point: for Sun hardware, Oracle outsources most of its manufacturing and all of its logistics to third parties. These are vital activities, but ones where Oracle doesn't have a core competency, so we shift them to business partners who do. Within our enterprise, we always retain the core functions of product development, support, and most of the sales function, because that's what constitutes our core value to our customers. This is a perfect example of a virtual enterprise.

    What are the implications of this? It means that we must exchange direct internal control for indirect external collaboration. This fundamentally changes the relative importance of different business processes, the boundaries of security and information sharing, and the relationship of the supply chain systems to the ERP. The challenge is that the systems required to support this virtual paradigm are still mired in "island enterprise" thinking. But help is at hand. Developments such as the Web, social networks, collaboration, and rules-based orchestration offer great potential to fundamentally re-architect supply chain systems to better support the virtual enterprise.

    Supply Chain Management Systems in a Virtual Enterprise

    Historically, enterprise software was constructed to automate the ERP - and then the supply chain systems extended the ERP. They were joined at the hip. In virtual enterprises, the supply chain system needs to be ERP-agnostic, sitting above each of the ERPs that are distributed across the virtual enterprise - most of which are operating in other businesses. This is vital so that the supply chain system can manage the flow of material and the related information through the multiple enterprises. It has to have strong collaboration tools. It needs to be highly flexible. Users need to be able to see information that's coming from multiple sources and be able to react and respond to events across those sources.

    Oracle Fusion Distributed Order Orchestration (DOO) is a perfect example of a supply chain system designed to operate in this virtual way. DOO embraces the idea that a company's fulfillment challenge is a distributed, multi-enterprise problem. It enables users to manage the process and the trading partners in a uniform way and deliver a consistent user experience while operating over a heterogeneous, virtual enterprise. This is a fundamental shift at the core of managing supply chains. It forces virtual enterprises to think architecturally about how best to construct their supply chain systems.
In my next post, I will share examples of companies that have made that shift and talk more about the distributed orchestration process.

    Read the article

  • Adding new events to be handled by the script part without recompiling the C++ source code

    - by paul424
    As the title says, I want to write an event system whose handling methods are written in an external scripting language, namely AngelScript. The AngelScript methods would have access to the game's world-modifying API (which has to be registered with the AngelScript machine as the game starts). I have come to this problem: if we use AngelScript for rapid prototyping of the game's world behavior, we would like to be able to define new Event types without recompiling the main binary. Is that even possible? Don't stick to AngelScript; I am thinking about any possible scripting language. The main thought is this: if some genuinely new Event is to be constructed, we must monitor some information coming from the binary, for example a variable's value changing, and trigger Events when conditions on those changes are met. We might monitor a creature's HP by having a function called on each change; there we could define new Events such as "creature hurt", "creature killed", "creature healed", or "creature killed and tormented to pieces" (HP going very far below 0). Are there any other ideas for constructing new Events on the scripting side?

    Read the article

  • T-SQL User-Defined Functions: the good, the bad, and the ugly (part 1)

    - by Hugo Kornelis
    So you thought that encapsulating code in user-defined functions for easy reuse is a good idea? Think again! SQL Server supports three types of user-defined functions. Only one of them qualifies as good. The other two – well, the title says it all, doesn't it?

    The bad: scalar functions

    A scalar user-defined function (UDF) is very much like a stored procedure, except that it always returns a single value of a predefined data type – and because of that property, it isn't invoked with an EXECUTE statement,...(read more)
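    To make the taxonomy concrete before the series dives in, here is a small T-SQL illustration of my own (not Hugo's code) contrasting the bad kind with the one type that generally qualifies as good, the inline table-valued function, which the optimizer can expand into the calling query like a view:

        -- The bad: a scalar UDF, invoked once per row and opaque to the optimizer.
        CREATE FUNCTION dbo.TripleScalar(@Value int)
        RETURNS int
        AS
        BEGIN
            RETURN 3 * @Value;
        END;
        GO

        -- The good: an inline table-valued function, expanded into the calling query.
        CREATE FUNCTION dbo.TripleInline(@Value int)
        RETURNS TABLE
        AS
        RETURN (SELECT 3 * @Value AS Triple);
        GO

    A caller uses the inline version via CROSS APPLY, e.g. SELECT t.Triple FROM dbo.SomeTable AS s CROSS APPLY dbo.TripleInline(s.SomeColumn) AS t.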

    Read the article

  • Implementing Cluster Continuous Replication, Part 3

    Cluster continuous replication (CCR) uses log shipping and failover to provide a more resilient email system with faster recovery. Once it is installed, a clustered server requires different management routines. These are done either with a GUI tool, the Failover Cluster Management Console, or with the Exchange Management Shell; PowerShell can also handle some tasks. Confused? Not for long, since Brien Posey is once more here to help.

    Read the article

  • Raw Materials - Og, Sumerian DBA, Part 2

    A disruptive innovation raises an old, old question.

    Read the article

  • PostgreSQL, Ubuntu, NetBeans IDE (Part 3)

    - by Geertjan
    To complete the picture, let's use the traditional (that is, old) Hibernate mechanism, i.e., via XML files, rather than via the annotations shown yesterday. It's definitely trickier, with many more places where typos can occur, but that's why it's the old mechanism. I do not recommend this approach; I recommend the approach shown yesterday. The other players in this scenario include PostgreSQL, as outlined in the previous blog entries in this series. Here's the structure of the module, replacing the code shown yesterday.

    Here's the Employees class; notice that it has no annotations:

        import java.io.Serializable;
        import java.util.Date;

        public class Employees implements Serializable {

            private int employeeId;
            private String firstName;
            private String lastName;
            private Date dateOfBirth;
            private String phoneNumber;
            private String junk;

            public int getEmployeeId() { return employeeId; }
            public void setEmployeeId(int employeeId) { this.employeeId = employeeId; }
            public String getFirstName() { return firstName; }
            public void setFirstName(String firstName) { this.firstName = firstName; }
            public String getLastName() { return lastName; }
            public void setLastName(String lastName) { this.lastName = lastName; }
            public Date getDateOfBirth() { return dateOfBirth; }
            public void setDateOfBirth(Date dateOfBirth) { this.dateOfBirth = dateOfBirth; }
            public String getPhoneNumber() { return phoneNumber; }
            public void setPhoneNumber(String phoneNumber) { this.phoneNumber = phoneNumber; }
            public String getJunk() { return junk; }
            public void setJunk(String junk) { this.junk = junk; }
        }

    And here's the Hibernate configuration file:

        <?xml version="1.0"?>
        <!DOCTYPE hibernate-configuration PUBLIC
            "-//Hibernate/Hibernate Configuration DTD 3.0//EN"
            "http://hibernate.sourceforge.net/hibernate-configuration-3.0.dtd">
        <hibernate-configuration>
            <session-factory>
                <property name="hibernate.connection.driver_class">org.postgresql.Driver</property>
                <property name="hibernate.connection.url">jdbc:postgresql://localhost:5432/smithdb</property>
                <property name="hibernate.connection.username">smith</property>
                <property name="hibernate.connection.password">smith</property>
                <property name="hibernate.connection.pool_size">1</property>
                <property name="hibernate.default_schema">public</property>
                <property name="hibernate.transaction.factory_class">org.hibernate.transaction.JDBCTransactionFactory</property>
                <property name="hibernate.current_session_context_class">thread</property>
                <property name="hibernate.dialect">org.hibernate.dialect.PostgreSQLDialect</property>
                <property name="hibernate.show_sql">true</property>
                <mapping resource="org/db/viewer/employees.hbm.xml"/>
            </session-factory>
        </hibernate-configuration>

    Next, the Hibernate mapping file:

        <?xml version="1.0"?>
        <!DOCTYPE hibernate-mapping PUBLIC
            "-//Hibernate/Hibernate Mapping DTD 3.0//EN"
            "http://hibernate.sourceforge.net/hibernate-mapping-3.0.dtd">
        <hibernate-mapping>
            <class name="org.db.viewer.Employees"
                   table="employees"
                   schema="public"
                   catalog="smithdb">
                <id name="employeeId" column="employee_id" type="int">
                    <generator class="increment"/>
                </id>
                <property name="firstName" column="first_name" type="string" />
                <property name="lastName" column="last_name" type="string" />
                <property name="dateOfBirth" column="date_of_birth" type="date" />
                <property name="phoneNumber" column="phone_number" type="string" />
                <property name="junk" column="junk" type="string" />
            </class>
        </hibernate-mapping>

    Then, the HibernateUtil file, for providing access to the Hibernate SessionFactory:

        import java.net.URL;
        import org.hibernate.SessionFactory;
        import org.hibernate.cfg.AnnotationConfiguration;

        public class HibernateUtil {

            private static final SessionFactory sessionFactory;

            static {
                try {
                    // Create the SessionFactory from the standard (employees.cfg.xml) config file.
                    String res = "org/db/viewer/employees.cfg.xml";
                    URL myURL = Thread.currentThread().getContextClassLoader().getResource(res);
                    sessionFactory = new AnnotationConfiguration().configure(myURL).buildSessionFactory();
                } catch (Throwable ex) {
                    // Log the exception.
                    System.err.println("Initial SessionFactory creation failed." + ex);
                    throw new ExceptionInInitializerError(ex);
                }
            }

            public static SessionFactory getSessionFactory() {
                return sessionFactory;
            }
        }

    Finally, the "createKeys" in the ChildFactory:

        @Override
        protected boolean createKeys(List list) {
            Session session = HibernateUtil.getSessionFactory().getCurrentSession();
            Transaction transac = null;
            try {
                transac = session.beginTransaction();
                Query query = session.createQuery("from Employees");
                list.addAll(query.list());
            } catch (HibernateException he) {
                Exceptions.printStackTrace(he);
                if (transac != null) {
                    transac.rollback();
                }
            } finally {
                session.close();
            }
            return true;
        }

    Note that Constantine Drabo has a similar article here. Run the application and the result should be the same as yesterday.

    Read the article
