Search Results

Search found 27232 results on 1090 pages for 'mind mapping software'.

  • How to refer to enum values inside an NHibernate formula mapping specification?

    - by mark
    Dear ladies and sirs. I have two entity types: RunContainer (the parent entity type) and Run (the child entity type). Run has a property Status, which is of type RunStatus, like so: public enum RunStatus { Created, Starting, // ... } public class Run { public int ContainerId { get; private set; } // ... public RunStatus Status { get; private set; } } RunContainer has a calculated property ActiveRunCount, like so: public class RunContainer { public int Id { get; private set; } // ... public int ActiveRunCount { get; private set; } } In the mapping for the RunContainer.ActiveRunCount property, I use the formula specification like so: <property name="ActiveRunCount" formula="(select count(r.Id) from Run r where r.ContainerId = Id and r.Status = 1)"/> My problem is that I refer to the RunStatus enum values in the formula by their respective numeric value, rather than the appropriate symbolic name. Can anyone tell me how I can use the symbolic name instead? Thanks.
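
    One possible workaround, sketched below under the assumption that the class could be mapped with Fluent NHibernate (the mapping class and the truncated enum are my own illustration, not from the post): build the formula string in code, so the enum member is referenced by name and only its numeric value ends up in the generated SQL.

    ```csharp
    // Minimal sketch, assuming Fluent NHibernate; not taken from the original post.
    // The formula string is assembled from the enum member itself, so only the
    // compiler ever sees the symbolic name.
    using FluentNHibernate.Mapping;

    public enum RunStatus { Created, Starting /* ... */ }

    public class RunContainer
    {
        public virtual int Id { get; protected set; }
        public virtual int ActiveRunCount { get; protected set; }
    }

    public class RunContainerMap : ClassMap<RunContainer>
    {
        public RunContainerMap()
        {
            Id(x => x.Id);
            // (int)RunStatus.Starting == 1, so the emitted SQL matches the original
            // formula, but renaming or reordering the enum only needs a recompile.
            Map(x => x.ActiveRunCount).Formula(
                "(select count(r.Id) from Run r where r.ContainerId = Id " +
                $"and r.Status = {(int)RunStatus.Starting})");
        }
    }
    ```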

    Read the article

  • NHibernate mapping with optimistic-lock="version" and dynamic-update="true" is generating invalid update SQL

    - by SteveBering
    I have an entity "Group" with an assigned ID which is added to an aggregate in order to persist it. This causes an issue because NHibernate can't tell if it is new or existing. To remedy this issue, I changed the mapping to make the Group entity use optimistic locking on a sql timestamp version column. This caused a new issue. Group has a bag of sub objects. So when NHibernate flushes a new group to the database, it first creates the Group record in the Groups table, then inserts each of the sub objects, then does an update of the Group records to update the timestamp value. However, the sql that is generated to complete the update is invalid when the mapping is both dynamic-update="true" and optimistic-lock="version". Here is the mapping: <class xmlns="urn:nhibernate-mapping-2.2" dynamic-update="true" mutable="true" optimistic-lock="version" name="Group" table="Groups"> <id name="GroupNumber" type="System.String, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"> <column name="GroupNumber" length="5" /> <generator class="assigned" /> </id> <version generated="always" name="Timestamp" type="BinaryBlob" unsaved-value="null"> <column name="TS" not-null="false" sql-type="timestamp" /> </version> <property name="UID" update="false" type="System.Guid, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"> <column name="GroupUID" unique="true" /> </property> <property name="Description" type="AnsiString"> <column name="GroupDescription" length="25" not-null="true" /> </property> <bag access="field.camelcase-underscore" cascade="all" inverse="true" lazy="true" name="Assignments" mutable="true" order-by="GroupAssignAssignment"> <key foreign-key="fk_Group_Assignments"> <column name="GroupNumber" /> </key> <one-to-many class="Assignment" /> </bag> <many-to-one class="Aggregate" name="Aggregate"> <column name="GroupParentID" not-null="true" /> </many-to-one> </class> </hibernate-mapping> When the mapping includes both the dynamic update and the optimistic lock, the sql generated is: UPDATE groups SET WHERE GroupNumber = 11111 AND TS=0x00000007877 This is obviously invalid as there are no SET statements. If I remove the dynamic update part, everything gets updated during this update statement instead. This makes the statement valid, but rather unnecessary. Has anyone seen this issue before? Am I missing something? Thanks, Steve

    Read the article

  • I need a piece of software to use with my HP F4280 scanner.

    - by D Connors
    So, I got this printer/scanner about a year ago, and I'm really happy with it. The only thing is that I never really liked the HP scanning software that came with it. A few months ago I reformatted and reinstalled Windows 7. Then, once I plugged in the printer, I noticed that Windows recognized it automatically, and offered to install all the drivers by itself. So instead of manually installing the driver that came on the CD, I simply let Windows automatically install it from its servers, and so far it's great. Instead of HP's scanning software (which really wasn't pleasing me), I got a very simplified interface that is more than enough for my occasional scanning habits. Until today. Today I had to scan a bunch of old pictures for my father. And that simple interface felt like it was lacking quite a few features to make this repetitive task a little easier. And that's why I'm now looking for good software to use for scanning. By "good" I mean anything well thought out, and especially anything that will make my life easier when repetitive scanning. It doesn't need to have professional tweaking options, but having them is not a problem either. You guys got anything?

    Read the article

  • Optional one-to-one mapping in Hibernate

    - by hibernate
    How do I create an optional one-to-one mapping in the hibernate hbm file? For example, suppose that I have a User and a last_visited_page table. The user may or may not have a last_visited page. Here is my current one-to-one mapping in the hbm file: User Class: <one-to-one name="lastVisitedPage" class="LastVisitedPage" cascade="save-update"> LastVisitedPage Class: <one-to-one name="user" class="user" constrained="true" /> The above example does not allow the creation of a user who does not have a last visited page. A freshly created user has not visited any pages yet. How do I change the hbm mapping to make the userPrefs mapping optional?

    Read the article

  • NHibernate Domain Model and Mapping Files in Separate Projects

    - by Blake Blackwell
    Is there a way to separate out the domain objects and mapping files into two separate projects? I would like to create one project called MyCompany.MyProduct.Core that contains my domain model, and another project that is called MyCompany.MyProduct.Data.Oracle that contains my Oracle data mappings. However, when I try to unit test this I get the following error message: Named query 'GetClients' not found. Here is my mapping file: <hibernate-mapping xmlns="urn:nhibernate-mapping-2.2" assembly="MyCompany.MyProduct.Core" namespace="MyCompany.MyProduct.Core" > <class name="MyCompany.MyProduct.Core.Client" table="MY_CLIENT" lazy="false"> <id name="ClientId" column="ClientId"></id> <property name="ClientName" column="ClientName" /> <loader query-ref="GetClients"/> </class> <sql-query name="GetClients" callable="true"> <return class="Client" /> call procedure MyPackage.GetClients(:int_SummitGroupId) </sql-query> </hibernate-mapping> Here is my unit test: try { var cfg = new Configuration(); cfg.Configure(); cfg.AddAssembly( typeof( Client ).Assembly ); ISessionFactory sessionFactory = cfg.BuildSessionFactory(); IStatelessSession session = sessionFactory.OpenStatelessSession(); IQuery query = session.GetNamedQuery( "GetClients" ); query.SetParameter( "int_SummitGroupId", 3173 ); IList<Client> clients = query.List<Client>(); Assert.AreNotEqual( 0, clients.Count ); } catch( Exception ex ) { throw ex; } I think I may be improperly referencing the assembly, because if I do put the domain model object in the MyCompany.MyProduct.Data.Oracle project it works. Only when I separate out into two projects do I run into this problem.
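
    One thing to double-check (a sketch of my own, not an excerpt from the post): Configuration.AddAssembly only picks up *.hbm.xml files that are embedded resources in the assembly passed to it, so with the entities in Core and the mappings in Data.Oracle the configuration has to be pointed at the mapping assembly as well.

    ```csharp
    // Hedged sketch: load the embedded mapping resources from the mapping assembly,
    // not (only) from the assembly that holds the entity classes.
    using NHibernate;
    using NHibernate.Cfg;

    var cfg = new Configuration();
    cfg.Configure();
    // The .hbm.xml files must be embedded resources of MyCompany.MyProduct.Data.Oracle;
    // their assembly/namespace attributes keep pointing at the Core types.
    cfg.AddAssembly("MyCompany.MyProduct.Data.Oracle");
    ISessionFactory sessionFactory = cfg.BuildSessionFactory();
    ```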

    Read the article

  • Unicenter Software Delivery 4 not able to connect to MS SQL 2000 Database after W2003 SP2 upgrade

    - by grub
    Hello Everyone Yesterday I installed the Windows Server 2003 Service Pack 2 on a Windows Server 2003 which has Unicenter Software Delivery 4 installed. Prior to the installation I disabled every CA service on the server (Brightstor, SDO , RCO, TNG) and the MS SQL 2000 service. After the installation of the SP2 I enabled the services again but the Unicenter Service is not able to connect to the MS SQL 2000 Database anymore. The database itself is up and running and I can connect to it with the Enterprise Manager. A dbcc checkdb doesnt return any errors on the Unicenter database. The Unicenter service throws the following error messages during startup: IM[1] 27/05 10:38:31,272 Installation Manager in init phase IM[1] 27/05 10:38:31,694 Process IM(L) - [004152] failed to open database SDDATA. dbopen() call failed. IM[1] 27/05 10:38:31,694 sqls error details: IM[1] 27/05 10:38:31,694 (null) IM[1] 27/05 10:38:32,069 ##EXCEPTION## TableError T@:PS_SQLS\isam_db.cxx:744. IM[1] 27/05 10:38:32,069 ##EXCEPTION## TableError C@:TaskmgrL\ASMTML.CXX:596. IM[1] 27/05 10:38:32,069 ##EXCEPTION## ErrorCode: 4711 in SDDATA:Isam::Isam. Process IM(L) - [004152] failed to open database SDDATA. dbopen() call failed. IM[1] 27/05 10:38:32,069 sqls error details: IM[1] 27/05 10:38:32,069 (null) IM[1] 27/05 10:38:32,069 returned 0. IM[1] 27/05 10:38:32,084 Persistent Storage could not be opened. Error cause is found in the ASM Event Log. Restart Task Manager. IM[1] 27/05 10:38:32,084 Failed to open database. IM[1] 27/05 10:38:32,084 Installation Manager ends> If I check the Unicenter configutation with *chkmib_l* the tool throws an exception and creates a small dump file. An Exception Occurred: Time: 27/05 09:49:38,928 Reason: ChkMIB_l.exe caused an UNKNOWN_EXCEPTION in module kernel32.dll at 7C82001B:77E4BEE7 Registers: EAX=0012F908 EBX=00000000 ECX=00000000 EDX=02410004 ESI=0012F998 EDI=0012F998 EBP=0012F958 ESP=0012F904 EIP=77E4BEE7 FLG=00000206 CS =7C82001B DS =B90023 SS =120023 ES =120023 FS =7C82003B GS =3F0000 Call Stack: 7C82001B:77E4BEE7 (0xE06D7363 0x00000001 0x00000003 0x0012F98C) kernel32.dll 7C82001B:77BB3259 (0x0012F9B8 0x2B017C50 0x2B024404 0x00B68C98) MSVCRT.dll 7C82001B:2B010C42 (0x00020003 0x010C00FE 0x003F0190 0x00B69050) PS.dll << SOFTWARE DELIVERY INSTANCE INFO >> TRIGGER 0(1) instances: JCE 0(1) instances: TM 0(1) instances: IM 0(1) instances: DM 0(1) instances: DPU 0(71) instances: NATF 0(1) instances: MIBCONV 0(0) instances: API 0(4) instances: DTSFT 0(0) instances: TNGPOP 0(0) instances: DGATE 0(0) instances: << FLUSHING MEMORY TRACES >> << STOP FLUSHING MEMORY TRACES >> I compared the configuration of the SDO service and the system configuration with another server on which the Windows Server 2003 SP2 is installed and SDO is working. The configuration is the same and the same driver and software versions are used. Do you have any idea what causes the connection issue? Should I deinstall the unicenter service and make a fresh installation on the server or should I remove the Windows Server 2003 SP2? I don't want to remove the SP2 because it's a requirement for WSUS3 SP2 and I really don't want to know how many possible exploits are possible in such an old system ;-) Thank you very much and have a nice day. Below you can find more detailed information about the system and the SDO service. 
psinfo output (system information) System information for \\CZZAAS1003: Uptime: 0 days 14 hours 38 minutes 50 seconds Kernel version: Microsoft Windows Server 2003, Multiprocessor Free Product type: Standard Edition Product version: 5.2 Service pack: 2 Kernel build number: 3790 Install date: 23.9.2004, 11:16:11s IE version: 6.0000 System root: C:\WINDOWS Processors: 2 Processor speed: 2.3 GHz Processor type: Intel(R) Xeon(TM) CPU Physical memory: 1024 MB Video driver: RAGE XL PCI Family (Microsoft Corporation) sdver output (Unicenter Software delivery version) Unicenter Software Delivery 4.0 SP1 I2 ENU [2901] Copyright 2004 Computer Associates International, Incorporated ms sql 2000 version and odbc driver version MS SQL 2000 Server Standard Edition Product Version: 8.00.760 (SP3) ODBC Driver: SQL Server - Version 2000.86.3959.00 complete Unicenter Software delivery service log file TRIGGER[1] 27/05 10:38:28,366 SD Trigger Agent has started NATF[1] 27/05 10:38:28,928 Initiation phase finished IM[1] 27/05 10:38:31,272 Installation Manager in init phase IM[1] 27/05 10:38:31,694 Process IM(L) - [004152] failed to open database SDDATA. dbopen() call failed. IM[1] 27/05 10:38:31,694 sqls error details: IM[1] 27/05 10:38:31,694 (null) IM[1] 27/05 10:38:32,069 ##EXCEPTION## TableError T@:PS_SQLS\isam_db.cxx:744. IM[1] 27/05 10:38:32,069 ##EXCEPTION## TableError C@:TaskmgrL\ASMTML.CXX:596. IM[1] 27/05 10:38:32,069 ##EXCEPTION## ErrorCode: 4711 in SDDATA:Isam::Isam. Process IM(L) - [004152] failed to open database SDDATA. dbopen() call failed. IM[1] 27/05 10:38:32,069 sqls error details: IM[1] 27/05 10:38:32,069 (null) IM[1] 27/05 10:38:32,069 returned 0. IM[1] 27/05 10:38:32,084 Persistent Storage could not be opened. Error cause is found in the ASM Event Log. Restart Task Manager. IM[1] 27/05 10:38:32,084 Failed to open database. IM[1] 27/05 10:38:32,084 Installation Manager ends TM[1] 27/05 10:38:32,116 Task Manager in init phase TM[1] 27/05 10:38:32,334 Process TM(L) - [006132] failed to open database SDDATA. dbopen() call failed. TM[1] 27/05 10:38:32,334 sqls error details: TM[1] 27/05 10:38:32,334 (null) TM[1] 27/05 10:38:32,381 ##EXCEPTION## TableError T@:PS_SQLS\isam_db.cxx:744. TM[1] 27/05 10:38:32,381 ##EXCEPTION## TableError C@:TaskmgrL\ASMTML.CXX:596. TM[1] 27/05 10:38:32,381 ##EXCEPTION## ErrorCode: 4711 in SDDATA:Isam::Isam. Process TM(L) - [006132] failed to open database SDDATA. dbopen() call failed. TM[1] 27/05 10:38:32,381 sqls error details: TM[1] 27/05 10:38:32,381 (null) TM[1] 27/05 10:38:32,381 returned 0. TM[1] 27/05 10:38:32,381 Persistent Storage could not be opened. Error cause is found in the ASM Event Log. Restart Task Manager. TM[1] 27/05 10:38:32,381 Failed to open database. TM[1] 27/05 10:38:32,381 Task Manager ends DM[1] 27/05 10:38:33,272 Dialogue Manager is now active API[1] 27/05 10:38:34,397 API Server Process in init phase API[1] 27/05 10:38:34,397 API - SDNLS_Init API[1] 27/05 10:38:34,397 API - connectEM API[1] 27/05 10:38:34,412 API - apiServ.init DM[1] 27/05 10:38:34,678 **AND** 1 Agents triggered API[1] 27/05 10:38:34,709 Process API(L) - [005680] failed to open database SDDATA. dbopen() call failed. API[1] 27/05 10:38:34,709 sqls error details: API[1] 27/05 10:38:34,709 (null) API[1] 27/05 10:38:34,756 ##EXCEPTION## TableError T@:PS_SQLS\isam_db.cxx:744. API[1] 27/05 10:38:34,756 ##EXCEPTION## TableError C@:MainAPIL\APISERVL.CXX:246. API[1] 27/05 10:38:34,756 ##EXCEPTION## ErrorCode: 4711 in SDDATA:Isam::Isam. 
Process API(L) - [005680] failed to open database SDDATA. dbopen() call failed. API[1] 27/05 10:38:34,756 sqls error details: API[1] 27/05 10:38:34,756 (null) API[1] 27/05 10:38:34,756 returned 0. API[1] 27/05 10:38:34,756 Open of the database failed. API[1] 27/05 10:38:34,756 API - apiServ.init complete API[1] 27/05 10:38:34,756 API - start_APIServer DM[1] 27/05 10:38:34,803 CZZAAR1037 DPU[1:CZZAAR1037] 27/05 10:38:35,772 DPU in init phase DPU[1:CZZAAR1037] 27/05 10:38:36,100 >> GetManagerData DPU[1:CZZAAR1037] 27/05 10:38:36,287 >> SetCompInfo DPU[1:CZZAAR1037] 27/05 10:38:36,334 >> GetContainerList DPU[1:CZZAAR1037] 27/05 10:38:36,350 getJobState 3 from 5b6ad DPU[1:CZZAAR1037] 27/05 10:38:36,350 getJobState 3 from 5b6ad DPU[1:CZZAAR1037] 27/05 10:38:36,350 getJobState 3 from 5b6b7 DPU[1:CZZAAR1037] 27/05 10:38:36,350 getJobState 3 from 5b6b7 DPU[1:CZZAAR1037] 27/05 10:38:36,350 getJobState 3 from 5b6c1 DPU[1:CZZAAR1037] 27/05 10:38:36,350 getJobState 3 from 5b6c1 DPU[1:CZZAAR1037] 27/05 10:38:36,366 getJobState 3 from 5b6cb DPU[1:CZZAAR1037] 27/05 10:38:36,366 getJobState 3 from 5b6cb DPU[1:CZZAAR1037] 27/05 10:38:36,366 getJobState 3 from 5b6f9 DPU[1:CZZAAR1037] 27/05 10:38:36,366 getJobState 3 from 5b6f9 DPU[1:CZZAAR1037] 27/05 10:38:36,366 getJobState 3 from 5b71a DPU[1:CZZAAR1037] 27/05 10:38:36,366 getJobState 3 from 5b71a DPU[1:CZZAAR1037] 27/05 10:38:36,366 getJobState 3 from 5b724 DPU[1:CZZAAR1037] 27/05 10:38:36,381 getJobState 3 from 5b724 DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b72e DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b72e DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b738 DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b738 DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b742 DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b742 DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b74c DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b74c DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b756 DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b756 DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b78a DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b78a DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b7af DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b7af DPU[1:CZZAAR1037] 27/05 10:38:36,522 >> SetCompAttr DPU[1:CZZAAR1037] 27/05 10:38:36,569 >> SetDetected DPU[1:CZZAAR1037] 27/05 10:38:36,584 disconnect DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b6ad DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b6b7 DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b6c1 DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b6cb DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b6f9 DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b71a DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b724 DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b72e DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b738 DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b742 DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b74c DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b756 DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b78a DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b7af DPU[1:CZZAAR1037] 27/05 10:38:36,584 DPU ends DM[1] 27/05 10:38:38,006 **AND** 0 Agents triggered JCE[1] 27/05 10:38:38,053 JCE starts DM[1] 27/05 10:38:38,287 CZZAAS1003 DPU[2:CZZAAS1003] 
27/05 10:38:38,412 DPU in init phase DPU[2:CZZAAS1003] 27/05 10:38:38,647 >> GetManagerData DPU[2:CZZAAS1003] 27/05 10:38:38,756 >> SetCompInfo DPU[2:CZZAAS1003] 27/05 10:38:38,787 >> GetContainerList DM[1] 27/05 10:38:38,850 **AND** 1 Agents triggered DM[1] 27/05 10:38:38,928 CZZAAR1124 DPU[3:CZZAAR1124] 27/05 10:38:39,053 DPU in init phase DPU[3:CZZAAR1124] 27/05 10:38:39,272 >> GetManagerData DM[1] 27/05 10:38:39,334 **AND** 1 Agents triggered DPU[3:CZZAAR1124] 27/05 10:38:39,381 >> SetCompInfo DPU[3:CZZAAR1124] 27/05 10:38:39,412 >> GetContainerList DM[1] 27/05 10:38:39,412 CZZAAR1125 DPU[3:CZZAAR1124] 27/05 10:38:39,428 getJobState 3 from 5b88e DPU[3:CZZAAR1124] 27/05 10:38:39,428 getJobState 3 from 5b88e DPU[2:CZZAAS1003] 27/05 10:38:39,491 >> SetCompAttr DPU[3:CZZAAR1124] 27/05 10:38:39,522 >> SetCompAttr DPU[4:CZZAAR1125] 27/05 10:38:39,522 DPU in init phase DPU[3:CZZAAR1124] 27/05 10:38:39,584 >> SetDetected DPU[2:CZZAAS1003] 27/05 10:38:39,584 >> SetDetected DPU[3:CZZAAR1124] 27/05 10:38:39,584 disconnect DPU[3:CZZAAR1124] 27/05 10:38:39,600 getJobState 3 from 5b88e DPU[3:CZZAAR1124] 27/05 10:38:39,600 DPU ends DPU[2:CZZAAS1003] 27/05 10:38:39,631 disconnect DPU[2:CZZAAS1003] 27/05 10:38:39,631 DPU ends DPU[4:CZZAAR1125] 27/05 10:38:39,756 >> GetManagerData DPU[4:CZZAAR1125] 27/05 10:38:39,850 >> SetCompInfo DPU[4:CZZAAR1125] 27/05 10:38:39,881 >> GetContainerList DPU[4:CZZAAR1125] 27/05 10:38:39,897 getJobState 3 from 5b8a9 DPU[4:CZZAAR1125] 27/05 10:38:39,897 getJobState 3 from 5b8a9 DPU[4:CZZAAR1125] 27/05 10:38:39,991 >> SetCompAttr DPU[4:CZZAAR1125] 27/05 10:38:40,100 >> SetDetected DPU[4:CZZAAR1125] 27/05 10:38:40,116 disconnect DPU[4:CZZAAR1125] 27/05 10:38:40,116 getJobState 3 from 5b8a9 DPU[4:CZZAAR1125] 27/05 10:38:40,116 DPU ends DM[1] 27/05 10:38:40,741 **AND** 0 Agents triggered JCE[1] 27/05 10:38:42,756 JCE ends DM[1] 27/05 10:38:47,475 **AND** 0 Agents triggered DM[1] 27/05 10:38:54,241 **AND** 0 Agents triggered

    Read the article

  • Change custom mapping - Sharp Architecture / Fluent NHibernate

    - by csetzkorn
    I am using the sharp architecture which also deploys FNH. The db schema sql code is generated during the testing like this: [TestFixture] [Category("DB Tests")] public class MappingIntegrationTests { [SetUp] public virtual void SetUp() { string[] mappingAssemblies = RepositoryTestsHelper.GetMappingAssemblies(); configuration = NHibernateSession.Init( new SimpleSessionStorage(), mappingAssemblies, new AutoPersistenceModelGenerator().Generate(), "../../../../app/XXX.Web/NHibernate.config"); } [TearDown] public virtual void TearDown() { NHibernateSession.CloseAllSessions(); NHibernateSession.Reset(); } [Test] public void CanConfirmDatabaseMatchesMappings() { var allClassMetadata = NHibernateSession.GetDefaultSessionFactory().GetAllClassMetadata(); foreach (var entry in allClassMetadata) { NHibernateSession.Current.CreateCriteria(entry.Value.GetMappedClass(EntityMode.Poco)) .SetMaxResults(0).List(); } } /// <summary> /// Generates and outputs the database schema SQL to the console /// </summary> [Test] public void CanGenerateDatabaseSchema() { System.IO.TextWriter writeFile = new StreamWriter(@"d:/XXXSqlCreate.sql"); var session = NHibernateSession.GetDefaultSessionFactory().OpenSession(); new SchemaExport(configuration).Execute(true, false, false, session.Connection, writeFile); } private Configuration configuration; } I am trying to use: using FluentNHibernate.Automapping; using xxx.Core; using SharpArch.Data.NHibernate.FluentNHibernate; using FluentNHibernate.Automapping.Alterations; namespace xxx.Data.NHibernateMaps { public class x : IAutoMappingOverride<x> { public void Override(AutoMapping<Tx> mapping) { mapping.Map(x => x.text, "text").CustomSqlType("varchar(max)"); mapping.Map(x => x.url, "url").CustomSqlType("varchar(max)"); } } } To change the standard mapping of strings from NVARCHAR(255) to varchar(max). This is not picked up during the sql schema generation. I also tried: mapping.Map(x = x.text, "text").Length(100000); Any ideas? Thanks. Christian
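
    One detail that sometimes bites here (the sketch below is my own and assumes the generator is not already registering overrides): an IAutoMappingOverride<T> is only honoured if the auto-persistence model is told which assembly to scan for overrides, e.g. via UseOverridesFromAssemblyOf.

    ```csharp
    // Hedged sketch; the entity, override and factory below are illustrative
    // stand-ins for the real Sharp Architecture AutoPersistenceModelGenerator.
    using FluentNHibernate.Automapping;
    using FluentNHibernate.Automapping.Alterations;

    public class Post
    {
        public virtual int Id { get; set; }
        public virtual string Text { get; set; }
    }

    public class PostMappingOverride : IAutoMappingOverride<Post>
    {
        public void Override(AutoMapping<Post> mapping)
        {
            // Without an applied override, SchemaExport falls back to NVARCHAR(255).
            mapping.Map(p => p.Text).CustomSqlType("varchar(max)");
        }
    }

    public static class ModelFactory
    {
        public static AutoPersistenceModel Build()
        {
            return AutoMap.AssemblyOf<Post>()
                // The override class above is silently ignored unless its assembly
                // is registered here (or via an equivalent call in the generator).
                .UseOverridesFromAssemblyOf<PostMappingOverride>();
        }
    }
    ```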

    Read the article

  • Can't Compile Correct Mapping File

    - by NoOne
    Hello, I'm now developing an application in C# with Remoting objects and NHibernate. So here is how my projects are divided. Views Layer: this layer will be responsible for the user interface. This layer will always use the Control Layer to create and edit objects. Control Layer: this is my persistence layer; here I'll have all the NHibernate configuration. This is my critical point, because the ListSingleton project will have only my RemoteObject. (Here I have the App.config file.) Models Layer: entity layer. Here I only have the entity classes and their respective mappings. I've done a test solution with no remoting and only with the projects of the Control and Model layers. It was all working ok. Now that I added the Views Layer and set the solution to start with the Client and Server projects (the Client calls the Control Layer, which then tries to persist an object), I'm getting an error at: Configuration cfg = new Configuration(); cfg.AddXmlFile("mapping/User.hbm.xml"); InnerException: {"Could not compile the mapping document: mapping/User.hbm.xml"} InnerException.InnerException: {"Could not find the dialect in the configuration"} StackTrace in the InnerException.InnerException " em NHibernate.Dialect.Dialect.GetDialect(IDictionary`2 props)\r\n em NHibernate.Cfg.Configuration.AddValidatedDocument(NamedXmlDocument doc)" But I know that there is no error in the mapping file, because I used it in my test application.
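
    One likely culprit, sketched below as an assumption rather than a confirmed diagnosis: when the Client/Server executables start, NHibernate reads the configuration of the running process, not the App.config of the Control Layer assembly, and without a dialect setting it fails exactly like this. Supplying the properties in code removes that dependency (the connection string below is a placeholder).

    ```csharp
    // Hedged sketch: set dialect, driver and connection string programmatically so
    // the mapping can be compiled regardless of which process hosts the code.
    using NHibernate;
    using NHibernate.Cfg;

    var cfg = new Configuration()
        .SetProperty(NHibernate.Cfg.Environment.Dialect,
                     "NHibernate.Dialect.MsSql2005Dialect")
        .SetProperty(NHibernate.Cfg.Environment.ConnectionDriver,
                     "NHibernate.Driver.SqlClientDriver")
        .SetProperty(NHibernate.Cfg.Environment.ConnectionString,
                     "Server=.;Database=AppDb;Integrated Security=SSPI;"); // placeholder

    cfg.AddXmlFile("mapping/User.hbm.xml");
    ISessionFactory factory = cfg.BuildSessionFactory();
    ```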

    Read the article

  • Hibernate - Problem in parsing mapping file (.hbm.xml)

    - by Yatendra Goel
    I am new to Hibernate. I have an exception while running an Hibernate-based application. The exception is as follows: 16 [main] INFO org.hibernate.cfg.Environment - Hibernate 3.3.2.GA 16 [main] INFO org.hibernate.cfg.Environment - hibernate.properties not found 16 [main] INFO org.hibernate.cfg.Environment - Bytecode provider name : javassist 31 [main] INFO org.hibernate.cfg.Environment - using JDK 1.4 java.sql.Timestamp handling 94 [main] INFO org.hibernate.cfg.Configuration - configuring from resource: /hibernate.cfg.xml 94 [main] INFO org.hibernate.cfg.Configuration - Configuration resource: /hibernate.cfg.xml 219 [main] INFO org.hibernate.cfg.Configuration - Reading mappings from resource : app/data/City.hbm.xml 266 [main] ERROR org.hibernate.util.XMLHelper - Error parsing XML: XML InputStream(12) Attribute "coloumn" must be declared for element type "property". 266 [main] ERROR org.hibernate.util.XMLHelper - Error parsing XML: XML InputStream(13) Attribute "coloumn" must be declared for element type "property". 266 [main] ERROR org.hibernate.util.XMLHelper - Error parsing XML: XML InputStream(14) Attribute "coloumn" must be declared for element type "property". It seems that it is not finding coloumn attribute of the property element in the mappings file but my mappings file do have the coloumn attribute. Below is the mappings file (City.hbm.xml) <?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE hibernate-mapping PUBLIC "-//Hibernate/Hibernate Mapping DTD 3.0//EN" "http://hibernate.sourceforge.net/hibernate-mapping-3.0.dtd"> <hibernate-mapping package="app.data"> <class name="City" table="CITY"> <id column="CITY_ID" name="cityId"> <generator class="native"/> </id> <property name="cityDisplyaName" coloumn="CITY_DISPLAY_NAME" /> <property coloumn="CITY_MEANINGFUL_NAME" name="cityMeaningFulName" /> <property coloumn="CITY_URL" name="cityURL" /> </class> </hibernate-mapping>

    Read the article

  • Use SQL query to populate property in NHibernate mapping file

    - by brainimus
    I have an object which contains a property that is the result of an SQL statement. How do I add the SQL statement to my NHibernate mapping file? Example Object: public class Library{ public int BookCount { get; set; } } Example Mapping File: <hibernate-mapping> <class name="Library" table="Libraries"> <property name="BookCount" type="int"> <!-- This is where I want the SQL query to populate the value. --> </class> </hibernate-mapping> Example SQL Query: SELECT COUNT(*) FROM BOOKS WHERE BOOKS.LIBRARY_ID = LIBRARIES.ID
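
    NHibernate's formula support is intended for exactly this: in the hbm file it is the formula attribute on <property>, which takes an arbitrary SQL expression evaluated against the row being loaded. The Fluent NHibernate sketch below is my own illustration (column names assumed from the example query).

    ```csharp
    // Hedged sketch: map BookCount to a SQL expression instead of a column.
    using FluentNHibernate.Mapping;

    public class Library
    {
        public virtual int Id { get; set; }
        public virtual int BookCount { get; set; }
    }

    public class LibraryMap : ClassMap<Library>
    {
        public LibraryMap()
        {
            Table("Libraries");
            Id(x => x.Id);
            // The unqualified ID is resolved against the Libraries row being loaded,
            // giving the correlated count from the example query.
            Map(x => x.BookCount).Formula(
                "(SELECT COUNT(*) FROM BOOKS WHERE BOOKS.LIBRARY_ID = ID)");
        }
    }
    ```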

    Read the article

  • DTOs Collections mapping Problem

    - by the_knight5000
    I'm working on a multi-tier project with the following layers: DAL, BLL, a GUI layer, and DTOs shared between the BLL and GUI layers. I'm facing a problem mapping the objects from DAO to DTO. There is no problem with simple objects; the problem is with objects that have child collections of other objects. For example, Author has a Categories collection and Category has an Authors collection, so the mapping goes into an infinite loop. It gets more complex when I want to model self-join tables, e.g. Safe with a TransferSafe (Collection<Safe>) property, where the mapping again goes into an infinite loop. Any suggestions for a good solution or a practical mapping pattern?
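
    A common way out, sketched below with hypothetical hand-written mappers (nothing here is from the post): give the child collections shallow summary DTOs that carry no back-reference, so the recursion has nowhere to go.

    ```csharp
    // Hedged sketch: break the Author <-> Category cycle by mapping nested
    // collections to shallow DTOs that do not map their own children back.
    using System.Collections.Generic;
    using System.Linq;

    public class Author
    {
        public string Name { get; set; }
        public ICollection<Category> Categories { get; set; } = new List<Category>();
    }

    public class Category
    {
        public string Title { get; set; }
        public ICollection<Author> Authors { get; set; } = new List<Author>();
    }

    public class CategorySummaryDto
    {
        public string Title { get; set; }
    }

    public class AuthorDto
    {
        public string Name { get; set; }
        // Summary DTOs have no Authors collection, so the mapping terminates.
        public List<CategorySummaryDto> Categories { get; set; }
    }

    public static class AuthorMapper
    {
        public static AuthorDto ToDto(Author author) => new AuthorDto
        {
            Name = author.Name,
            Categories = author.Categories
                .Select(c => new CategorySummaryDto { Title = c.Title })
                .ToList()
        };
    }
    ```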

    Read the article

  • Required Skill Sets Of A Software Architect

    The question has been asked as to what is the required skill sets of a software architect. The answer to this is that it truly depends. When I state that it depend, it depends on the organization, industry, and skill sets available on the open market and internally within a company. With open ended skill sets even Napoleon Dynamite could be an architect. Napoleon Dynamite’s Skills Pedro: Have you asked anybody yet? Napoleon Dynamite: No, but who would? I don't even have any good skills. Pedro: What do you mean? Napoleon Dynamite: You know, like nunchuck skills, bow hunting skills, computer hacking skills... Girls only want boyfriends who have great skills. Pedro: Aren't you pretty good at drawing, like animals and warriors and stuff? This example might be a little off base but it does illustrate a point. What are the real required skills of a software architect? In my opinion, an architect needs to demonstrate the knowledge of the following three main skill set categories so that they are successful. General Skill Sets of an Architect Basic Engineering Skills Organizational  Skills Interpersonal Skills Basic Engineering Skills are a very large part of what a software architect deal with on a daily bases when designing or updating systems. Think about it, how good would a lead mechanic be if they did not know how to fix or repair cars? They would not be, and that is my point that architects need to have at least some basic skills regarding engineering. The skills listed below are generic in nature because they change from job to job, so in this discussion I am trying to focus more on generalities so that anyone can apply this information to their individual situation. Common Basic Engineering Skills Data Modeling Code Creation Configuration Testing Deployment/Publishing System and Environment Knowledge Organizational Skills If an Architect works for or with an origination then they will need strong organization skills to survive. An architect is no use to a project if the project is missed managed. Additionally, budgets and timelines can really affect a company and their products when established deadlines are repeated not meet. By not meeting these timelines a company is forced to cancel the project and waste all the money and time spent or spend more money until it is completed, if it is ever completed. Common Organizational Skills Project Management Estimation (Cost and Time) Creation and Maintenance of Accepted Standards Interpersonal Skills For me personally Interpersonal skill ranks above the other types of skill sets because an architect can quickly pick up the other two skill sets by communicating with other team/project members so that they are quickly up to speed on a project. Additionally, in order for an architect to manage a project or even derive rough estimates they will more than likely have to consult with others actually working on the code (Programmers/Software Engineers) to get there estimates since they will be the ones actually working on the changes to be implemented. Common Interpersonal Skills Good Communicator Focus on projects success over personal Honors roles within a team Reference: Taylor, R. N., Medvidovic, N., & Dashofy, E. M. (2009). Software architecture: Foundations, theory, and practice Hoboken, NJ: John Wiley & Sons

    Read the article

  • Combination of Operating Mode and Commit Strategy

    - by Kevin Yang
    If you want to populate a source into multiple targets, you may also want to ensure that every row from the source affects all targets uniformly (or separately). Let’s consider the Example Mapping below. If a row from SOURCE causes different changes in multiple targets (TARGET_1, TARGET_2 and TARGET_3), for example, it can be successfully inserted into TARGET_1 and TARGET_3, but failed to be inserted into TARGET_2, and the current Mapping Property TLO (target load order) is “TARGET_1 -> TARGET_2 -> TARGET_3”. What should Oracle Warehouse Builder do, in order to commit the appropriate data to all affected targets at the same time? If it doesn’t behave as you intended, the data could become inaccurate and possibly unusable.                                               Example Mapping In OWB, we can use Mapping Configuration Commit Strategies and Operating Modes together to achieve this kind of requirements. Below we will explore the combination of these two features and how they affect the results in the target tables Before going to the example, let’s review some of the terms we will be using (Details can be found in white paper Oracle® Warehouse Builder Data Modeling, ETL, and Data Quality Guide11g Release 2): Operating Modes: Set-Based Mode: Warehouse Builder generates a single SQL statement that processes all data and performs all operations. Row-Based Mode: Warehouse Builder generates statements that process data row by row. The select statement is in a SQL cursor. All subsequent statements are PL/SQL. Row-Based (Target Only) Mode: Warehouse Builder generates a cursor select statement and attempts to include as many operations as possible in the cursor. For each target, Warehouse Builder inserts each row into the target separately. Commit Strategies: Automatic: Warehouse Builder loads and then automatically commits data based on the mapping design. If the mapping has multiple targets, Warehouse Builder commits and rolls back each target separately and independently of other targets. Use the automatic commit when the consequences of multiple targets being loaded unequally are not great or are irrelevant. Automatic correlated: It is a specialized type of automatic commit that applies to PL/SQL mappings with multiple targets only. Warehouse Builder considers all targets collectively and commits or rolls back data uniformly across all targets. Use the correlated commit when it is important to ensure that every row in the source affects all affected targets uniformly. Manual: select manual commit control for PL/SQL mappings when you want to interject complex business logic, perform validations, or run other mappings before committing data. Combination of the commit strategy and operating mode To understand the effects of each combination of operating mode and commit strategy, I’ll illustrate using the following example Mapping. Firstly we insert 100 rows into the SOURCE table and make sure that the 99th row and 100th row have the same ID value. And then we create a unique key constraint on ID column for TARGET_2 table. So while running the example mapping, OWB tries to load all 100 rows to each of the targets. But the mapping should fail to load the 100th row to TARGET_2, because it will violate the unique key constraint of table TARGET_2. 
With different combinations of Commit Strategy and Operating Mode, here are the results ¦ Set-based/ Correlated Commit: Configuration of Example mapping:                                                     Result:                                                      What’s happening: A single error anywhere in the mapping triggers the rollback of all data. OWB encounters the error inserting into Target_2, it reports an error for the table and does not load the row. OWB rolls back all the rows inserted into Target_1 and does not attempt to load rows to Target_3. No rows are added to any of the target tables. ¦ Row-based/ Correlated Commit: Configuration of Example mapping:                                                   Result:                                                  What’s happening: OWB evaluates each row separately and loads it to all three targets. Loading continues in this way until OWB encounters an error loading row 100th to Target_2. OWB reports the error and does not load the row. It rolls back the row 100th previously inserted into Target_1 and does not attempt to load row 100 to Target_3. Then, if there are remaining rows, OWB will continue loading them, resuming with loading rows to Target_1. The mapping completes with 99 rows inserted into each target. ¦ Set-based/ Automatic Commit: Configuration of Example mapping: Result: What’s happening: When OWB encounters the error inserting into Target_2, it does not load any rows and reports an error for the table. It does, however, continue to insert rows into Target_3 and does not roll back the rows previously inserted into Target_1. The mapping completes with one error message for Target_2, no rows inserted into Target_2, and 100 rows inserted into Target_1 and Target_3 separately. ¦ Row-based/Automatic Commit: Configuration of Example mapping: Result: What’s happening: OWB evaluates each row separately for loading into the targets. Loading continues in this way until OWB encounters an error loading row 100 to Target_2 and reports the error. OWB does not roll back row 100th from Target_1, does insert it into Target_3. If there are remaining rows, it will continue to load them. The mapping completes with 99 rows inserted into Target_2 and 100 rows inserted into each of the other targets. Note: Automatic Correlated commit is not applicable for row-based (target only). If you design a mapping with the row-based (target only) and correlated commit combination, OWB runs the mapping but does not perform the correlated commit. In set-based mode, correlated commit may impact the size of your rollback segments. Space for rollback segments may be a concern when you merge data (insert/update or update/insert). Correlated commit operates transparently with PL/SQL bulk processing code. The correlated commit strategy is not available for mappings run in any mode that are configured for Partition Exchange Loading or that include a Queue, Match Merge, or Table Function operator. If you want to practice in your own environment, you can follow the steps: 1. Import the MDL file: commit_operating_mode.mdl 2. Fix the location for oracle module ORCL and deploy all tables under it. 3. Insert sample records into SOURCE table, using below plsql code: begin     for i in 1..99     loop         insert into source values(i, 'col_'||i);     end loop;     insert into source values(99, 'col_99'); end; 4. Configure MAPPING_1 to any combinations of operating mode and commit strategy you want to test. And make sure feature TLO of mapping is open. 5. 
Deploy Mapping “MAPPING_1”. 6. Run the mapping and check the result.

    Read the article

  • Commercial product using a GPL OS

    - by pfried
    We are planning to create a commercial product. The product consists of some MCUs and a small computer (we are developing on a Raspberry Pi at the moment). The computer needs an operating system, as we would like to keep things like WLAN and booting as simple as possible. We create some software running on this computer (a node.js application). Most operating systems, like Arch Linux, are licensed under the GPL. The product we would sell contains the computer with a preinstalled OS and software. This system operates as a central access point to the MCU devices and is able to control them. We use others' software in our product. We do not modify their source code. The product (the computer part) consists of a computer, an OS and software we create. How does the use of an OS affect our own code (licence)? Is there a possibility of avoiding the GPL for our own code? E.g. shipping the software separately? Are there any effects on other components of our product, e.g. the MCU part? The node.js application delivers a WebApp to the client where it is executed. Are there any effects there (as we would like to sell parts of the code as an additional app on the app stores)? I know we make use of the work of the community and I respect this. The problem is: the software alone is kind of useless without the MCU devices. I do not expect legal advice.

    Read the article

  • Where to start software analysis and design?

    - by Muneer
    I am starting to develop a big database-oriented piece of software. I have the full picture of the software in mind, and I need to do the design using UML. As there are various diagrams in UML, such as use case, class, statechart, component, deployment, and activity diagrams, where should I start my design? Should it be with the use case diagram, the class diagram, or the state chart? Which approach will help me put my mind's picture into a design? Please help me with this. Thanks.

    Read the article

  • Finding which OS a piece of software requires?

    - by Kannan
    How can I find out whether a piece of software (i.e., a portable single executable) requires a particular OS (Win98, Win98SE, WinME, Win2000, WinXP, Linux)? I am using Win98SE on one PC and WinXP on another. If I copy or install a portable program or package on Win98SE, only after installing or executing it does the program tell me it requires WinXP. Is there any software that can tell whether a particular program needs Win98SE or greater to run? I tried Dependency Walker by Steve Miller but got no results. Kindly help me solve this problem.
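
    For the Windows executables in the list, one low-tech check (a sketch of my own, with a placeholder path) is to read the minimum subsystem version recorded in the PE header; Windows refuses to start an image built for a newer version, which is why the program only complains once you try to run it.

    ```csharp
    // Hedged sketch: read MajorSubsystemVersion/MinorSubsystemVersion from the PE
    // optional header. Roughly: 4.x targets Win9x/NT4 and later, 5.0 needs
    // Windows 2000, 5.1 needs Windows XP. The file path is a placeholder.
    using System;
    using System.IO;

    static class MinimumOsCheck
    {
        static void Main()
        {
            using var fs = File.OpenRead(@"C:\tools\portable-app.exe"); // placeholder
            using var r = new BinaryReader(fs);

            fs.Position = 0x3C;                   // e_lfanew: file offset of the PE header
            fs.Position = r.ReadUInt32();
            if (r.ReadUInt32() != 0x00004550)     // "PE\0\0" signature
                throw new InvalidDataException("Not a PE executable.");

            fs.Position += 20;                    // skip the COFF file header
            long optionalHeader = fs.Position;

            fs.Position = optionalHeader + 48;    // same offset for PE32 and PE32+
            ushort major = r.ReadUInt16();        // MajorSubsystemVersion
            ushort minor = r.ReadUInt16();        // MinorSubsystemVersion

            Console.WriteLine($"Minimum Windows subsystem version: {major}.{minor}");
        }
    }
    ```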

    Read the article

  • Windows 2003 GPO Software Restrictions

    - by joeqwerty
    We're running a Terminal Server farm in a Windows 2003 Domain, and I found a problem with the Software Restrictions GPO settings that are being applied to our TS servers. Here are the details of our configuration and the problem: All of our servers (Domain Controllers and Terminal Servers) are running Windows Server 2003 SP2 and both the domain and forest are at Windows 2003 level. Our TS servers are in an OU where we have specific GPO's linked and have inheritance blocked, so only the TS specific GPO's are applied to these TS servers. Our users are all remote and do not have workstations joined to our domain, so we don't use loopback policy processing. We take a "whitelist" approach to allowing users to run applications, so only applications that we approve and add as path or hash rules are able to run. We have the Security Level in Software Restrictions set to Disallowed and Enforcement is set to "All software files except libraries". What I've found is that if I give a user a shortcut to an application, they're able to launch the application even if it's not in the Additional Rules list of "whitelisted" applications. If I give a user a copy of the main executable for the application and they attempt to launch it, they get the expected "this program has been restricted..." message. It appears that the Software Restrictions are indeed working, except for when the user launches an application using a shortcut as opposed to launching the application from the main executable itself, which seems to contradict the purpose of using Software Restrictions. My questions are: Has anyone else seen this behavior? Can anyone else reproduce this behavior? Am I missing something in my understanding of Software Restrictions? Is it likely that I have something misconfigured in Software Restrictions? EDIT To clarify the problem a little bit: No higher level GPO's are being enforced. Running gpresults shows that in fact, only the TS level GPO's are being applied and I can indeed see my Software Restictions being applied. No path wildcards are in use. I'm testing with an application that is at "C:\Program Files\Application\executable.exe" and the application executable is not in any path or hash rule. If the user launches the main application executable directly from the application's folder, the Software Restrictions are enforced. If I give the user a shortcut that points to the application executable at "C:\Program Files\Application\executable.exe" then they are able to launch the program. EDIT Also, LNK files are listed in the Designated File Types, so they should be treated as executable, which should mean that they are bound by the same Software Restrictions settings and rules.

    Read the article

  • SUSE Linux Enterprise Server software

    - by user69333
    Hello, A professor at the university asked me if I could install some software for him on his laptop that runs SLES 11. I'm not familiar with SUSE (I typically work with debian based machines) so I'm having some trouble finding/installing some software. Here's the list of software he needs installed: -xv (plotting software) -xmgrace -LaTeX Can someone point me toward some rpms for the above-mentioned software?

    Read the article

  • Most mind-blowing C++ hack you've ever seen?

    - by sblom
    In the same spirit as the "Hidden features of X?" series, what are the most mind-blowingly well-executed "I didn't even think the language could do that!" hacks that you've ever seen in C++. For example, my recent favorite is an implementation of the "operator" --> for pre-C++0x lambdas. Another fantastic example is Multi-dimensional analog literals. (Note: this is a community wiki question to avoid the appearance of reputation-whoring.)

    Read the article

  • What are some good books on software testing/quality?

    - by mjh2007
    I'm looking for a good book on software quality. It would be helpful if the book covered: The software development process (requirements, design, coding, testing, maintenance) Testing roles (who performs each step in the process) Testing methods (white box and black box) Testing levels (unit testing, integration testing, etc) Testing process (Agile, waterfall, spiral) Testing tools (simulators, fixtures, and reporting software) Testing of embedded systems The goal here is to find an easy to read book that summarizes the best practices for ensuring software quality in an embedded system. It seems most texts cover the testing of application software where it is simpler to generate automated test cases or run a debugger. A book that provided solutions for improving quality in a system where the tests must be performed manually and therefore minimized would be ideal.

    Read the article

  • Torrents: Can I protect my software by sending wrong bytes?

    - by Martijn Courteaux
    Hi, it's a topic that interests everyone: how can I protect my software against stealing, hacking, and reverse engineering? I was thinking: do my best to protect the program against reverse engineering. Then people will crack it and seed it with torrents. Then I download my own cracked software with my own torrent software, which then seeds incorrect data (bytes). Of course it has to seed critical bytes: just the bytes that are important for startup, saving and loading data, etc. So people who want to steal my software download my wrong bytes, and anyone who downloads from me (and seeds it later) can't do anything with it, because it is broken. Is this idea viable? Maybe good torrent clients check hashes from multiple peers to see whether the pieces (containing my broken bytes) I want to seed are correct or not? Thanks
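
    The asker's own suspicion is essentially right: BitTorrent clients hash-check every downloaded piece against the SHA-1 digests stored in the .torrent metadata, so deliberately corrupted pieces are discarded and re-requested from other peers (and repeat offenders may be banned). A minimal sketch of that per-piece check, with illustrative data:

    ```csharp
    // Hedged sketch of the per-piece verification a BitTorrent client performs;
    // piece bytes and the expected hash are illustrative, not real torrent data.
    using System;
    using System.Linq;
    using System.Security.Cryptography;

    static class PieceCheck
    {
        static bool PieceIsValid(byte[] pieceData, byte[] expectedSha1FromTorrent)
        {
            using var sha1 = SHA1.Create();
            return sha1.ComputeHash(pieceData).SequenceEqual(expectedSha1FromTorrent);
        }

        static void Main()
        {
            var piece    = new byte[] { 0x4D, 0x5A, 0x90, 0x00 }; // original payload bytes
            var tampered = new byte[] { 0x4D, 0x5A, 0x90, 0x01 }; // one "wrong" byte seeded

            byte[] expected;
            using (var sha1 = SHA1.Create())
                expected = sha1.ComputeHash(piece);               // hash shipped in .torrent

            Console.WriteLine(PieceIsValid(piece, expected));     // True: piece accepted
            Console.WriteLine(PieceIsValid(tampered, expected));  // False: piece discarded
        }
    }
    ```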

    Read the article

  • Self-reference entity in Hibernate

    - by Marco
    Hi guys, I have an Action entity, that can have other Action objects as child in a bidirectional one-to-many relationship. The problem is that Hibernate outputs the following exception: "Repeated column in mapping for collection: DbAction.childs column: actionId" Below the code of the mapping: <?xml version="1.0"?> <!DOCTYPE hibernate-mapping PUBLIC "-//Hibernate/Hibernate Mapping DTD 3.0//EN" "http://hibernate.sourceforge.net/hibernate-mapping-3.0.dtd"> <hibernate-mapping> <class name="DbAction" table="actions"> <id name="actionId" type="short" /> <property not-null="true" name="value" type="string" /> <set name="childs" table="action_action" cascade="all-delete-orphan"> <key column="actionId" /> <many-to-many column="actionId" unique="true" class="DbAction" /> </set> <join table="action_action" inverse="true" optional="false"> <key column="actionId" /> <many-to-one name="parentAction" column="actionId" not-null="true" class="DbAction" /> </join> </class> </hibernate-mapping>

    Read the article

  • SCALE 8x: Free software legal issues

    LWN.net: "But in reality the FLOSS ecosystem relies on a complex legal framework in order to run smoothly and to stand up to proprietary software competition: the various software licenses, contribution agreements, copyright and other "intellectual property" law."

    Read the article
