Search Results

Search found 4850 results on 194 pages for 'fluent nhibernate mapping'.

  • How to create managed properties at site collection level in SharePoint2013

    - by ybbest
    In SharePoint 2013, you can create managed properties at the site collection level. Today, I'd like to show you how to do so through PowerShell.
    1. Define your managed properties, crawled properties, and managed property type in an external CSV file. The PowerShell script will read this file and create the managed properties and the mappings.
    2. As you can see, I also defined the variant type, because you need the variant type to create the crawled property. In order to have the crawled properties, you need to do a full crawl and also make sure you have data populated for your custom column. However, if you do not want to run a full crawl to create those crawled properties, you can create them yourself using PowerShell; you just need to make sure the crawled properties you create have the same names they would have if created by a full crawl.
    Managed property types: Text = 1, Integer = 2, Decimal = 3, DateTime = 4, YesNo = 5, Binary = 6
    Variant types: Text = 31, Integer = 20, Decimal = 5, DateTime = 64, YesNo = 11
    3. You can use the following script to create your managed properties at the site collection level; the difference when creating managed properties at the site collection level is that you pass in the site collection id.

        param(
            [string] $siteUrl = "http://SP2013/",
            [string] $searchAppName = "Search Service Application",
            $ManagedPropertiesList = (IMPORT-CSV ".\ManagedProperties.csv")
        )
        Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
        $searchapp = $null

        function AppendLog
        {
            param ([string] $msg, [string] $msgColor)
            $currentDateTime = Get-Date
            $msg = $msg + " --- " + $currentDateTime
            if (!($logOnly -eq $True))
            {
                # write to console
                Write-Host -f $msgColor $msg
            }
            # write to log file
            Add-Content $logFilePath $msg
        }

        $scriptPath = Split-Path $myInvocation.MyCommand.Path
        $logFilePath = $scriptPath + "\CreateManagedProperties_Log.txt"

        function CreateRefiner
        {
            param ([string] $crawledName, [string] $managedPropertyName, [Int32] $variantType, [Int32] $managedPropertyType, [System.GUID] $siteID)
            $cat = Get-SPEnterpriseSearchMetadataCategory -Identity SharePoint -SearchApplication $searchapp
            $crawledproperty = Get-SPEnterpriseSearchMetadataCrawledProperty -Name $crawledName -SearchApplication $searchapp -SiteCollection $siteID
            if ($crawledproperty -eq $null)
            {
                Write-Host AppendLog "Creating Crawled Property for $managedPropertyName" Yellow
                $crawledproperty = New-SPEnterpriseSearchMetadataCrawledProperty -SearchApplication $searchapp -VariantType $variantType -SiteCollection $siteID -Category $cat -PropSet "00130329-0000-0130-c000-000000131346" -Name $crawledName -IsNameEnum $false
            }
            $managedproperty = Get-SPEnterpriseSearchMetadataManagedProperty -Identity $managedPropertyName -SearchApplication $searchapp -SiteCollection $siteID -ErrorAction SilentlyContinue
            if ($managedproperty -eq $null)
            {
                Write-Host AppendLog "Creating Managed Property for $managedPropertyName" Yellow
                $managedproperty = New-SPEnterpriseSearchMetadataManagedProperty -Name $managedPropertyName -Type $managedPropertyType -SiteCollection $siteID -SearchApplication $searchapp -Queryable:$true -Retrievable:$true -FullTextQueriable:$true -RemoveDuplicates:$false -RespectPriority:$true -IncludeInMd5:$true
            }
            $mappedProperty = $crawledproperty.GetMappedManagedProperties() | ?{ $_.Name -eq $managedProperty.Name }
            if ($mappedProperty -eq $null)
            {
                Write-Host AppendLog "Creating Crawled -> Managed Property mapping for $managedPropertyName" Yellow
                New-SPEnterpriseSearchMetadataMapping -CrawledProperty $crawledproperty -ManagedProperty $managedproperty -SearchApplication $searchapp -SiteCollection $siteID
            }
            $mappedProperty = $crawledproperty.GetMappedManagedProperties() | ?{ $_.Name -eq $managedProperty.Name }
            #Get-FASTSearchMetadataCrawledPropertyMapping -ManagedProperty $managedproperty
        }

        $searchapp = Get-SPEnterpriseSearchServiceApplication $searchAppName
        $site = Get-SPSite $siteUrl
        $siteId = $site.id

        Write-Host "Start creating Managed properties"
        $i = 1
        FOREACH ($property in $ManagedPropertiesList)
        {
            $propertyName = $property.managedPropertyName
            $crawledName = $property.crawledName
            $managedPropertyType = $property.managedPropertyType
            $variantType = $property.variantType
            Write-Host $managedPropertyType
            Write-Host "Processing managed property $propertyName $($i)..."
            $i++
            CreateRefiner $crawledName $propertyName $variantType $managedPropertyType $siteId
            Write-Host "Managed property created " $propertyName
        }

    Key Concepts
    Crawled Properties: Crawled properties are discovered by the search index service component when crawling content.
    Managed Properties: Properties that are part of the Search user experience, which means they are available for search results, advanced search, and so on, are managed properties.
    Mapping Crawled Properties to Managed Properties: To make a crawled property available for the Search experience (available for Search queries and displayed in Advanced Search and search results), you must map it to a managed property.
    References
    Administer search in SharePoint 2013 Preview
    Managing Metadata
    New-SPEnterpriseSearchMetadataCrawledProperty
    New-SPEnterpriseSearchMetadataManagedProperty
    Remove-SPEnterpriseSearchMetadataManagedProperty
    Overview of crawled and managed properties in SharePoint 2013 Preview
    SharePoint 2013 – Search Service Application

    Read the article

  • SQL SERVER – Integrate Your Data with Skyvia – Cloud ETL Solution

    - by Pinal Dave
    These days data integration often becomes a key aspect of business success. For business analysts it's very important to get integrated data from various sources, such as relational databases, cloud CRMs, etc., to make correct and successful decisions. There are various data integration solutions on the market, and today I will tell you about one of them – Skyvia. Skyvia is a cloud data integration service which allows integrating data in cloud CRMs and different relational databases. It is a completely online solution and does not require anything except a browser. Skyvia provides powerful ETL tools for data import, export, replication, and synchronization for SQL Server, other databases, and cloud CRMs.
    You can use Skyvia data import tools to load data from various sources to SQL Server (and SQL Azure). Skyvia supports cloud CRMs such as Salesforce and Microsoft Dynamics CRM, and databases such as MySQL and PostgreSQL. You can even migrate data from SQL Server to SQL Server, or from SQL Server to other databases and cloud CRMs. Additionally, Skyvia supports import of CSV files, either uploaded manually or stored on cloud file storage services, such as Dropbox, Box, Google Drive, or FTP servers.
    When data import is not enough, Skyvia offers bidirectional data synchronization. With this tool, you can synchronize SQL Server data with other databases and cloud CRMs. After performing the first synchronization, Skyvia tracks data changes in the synchronized data storages. In SQL Server databases (and other relational databases) it creates additional tracking tables and triggers. This allows synchronizing only the changed data. Skyvia also maps records to each other by their primary key values, so it does not require different sources to have the same primary key structure. It can still match the corresponding records without having to add any additional columns or change the data structure. The only requirement for synchronization is that primary keys must be autogenerated.
    With Skyvia it's not necessary for data to have the same structure in the integrated data storages. Skyvia supports powerful mapping mechanisms that allow synchronizing data with completely different structures. It provides support for complex mathematical and string expressions when mapping data, using lookups, etc. You may use data splitting – loading data from a single CSV file or source table to multiple related target tables. Or you may load data from several source CSV files or tables to several related target tables. In each case Skyvia preserves data relations. It builds the corresponding relations between the target data automatically.
    When you often work with cloud CRM data, native CRM data reporting and analysis tools may not be enough for you. And there is a vast set of professional data analysis and reporting tools available for SQL Server. With Skyvia you can quickly copy your cloud CRM data to an SQL Server database and apply the corresponding SQL Server tools to the data. In that case you can use Skyvia data replication tools, which allow you to quickly copy cloud CRM data to SQL Server or other databases without customizing any mapping. You just need to specify the columns to copy data from; target database tables will be created automatically. Skyvia offers powerful filtering settings to replicate only the records you need. Skyvia also provides the capability to export data from SQL Server (including SQL Azure) and other databases and cloud CRMs to CSV files.
    These files can either be downloaded manually or loaded to cloud file storages or an FTP server. You can use export, for example, to back up SQL Azure data to Dropbox. Any data integration operation can be scheduled for automatic execution. Thus, you can automate your SQL Azure data backup or data synchronization – just configure it once, then schedule it, and benefit from automatic data integration with Skyvia. Currently, registration and use of Skyvia is completely free, so you can try it yourself and find out whether its data migration and integration tools suit you. Visit this link to register on Skyvia: https://app.skyvia.com/register Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: Cloud Computing

    Read the article

  • Barcodes and Bugs

    - by Tim Dexter
    A great mail from Mike at Browning last week. He has been through the wringer getting his BIP barcoding sorted out, but he's now out of the woods. Here's the final result. By way of explanation, an excerpt from Mike's email:
    This is an example of the GS1_128 carton shipping labels we are now producing with BIP in our web application for our vendors who drop ship products to our dealers. It produces 4 labels per printed page, in PDF format, on peel & stick label paper. Each label has a unique carton number, and a unique carton serial number in the SSCC-18 barcode. This example is for Cabelas (each customer has slightly different GS1-128 label format requirements – custom template for each - a pain!). I am using custom java encoders I wrote for the UPC and SSCC-18 barcodes, and a standard encoder (code128b) for the ShipTo zip barcode. Is there any way yet to get around that SUPER ANNOYING bug when opening the rtf template in MS Word, and it replaces my xsl code text in the barcode fields with gibberish??? Every time I open it I have to re-enter all the xsl code. Not only to be able to read & edit it, but also to get it to work in BIP (BIP doesn't like the gibberish if I upload the template that has it).
    Mike's last point, regarding the annoying bug in the template builder, is one that I have experienced occasionally. The development team have looked at it and found it to be an issue with MS Word and not a plugin problem. That's all well and good, but how can you get around it? Well, you can take advantage of the font mapping that BIP offers to get the barcodes into the PDF output. As many of you know, to get a barcode font to appear in the PDF output you need to employ the xdo.cfg file in the template builder config directory. You would normally have an entry such as this:
        <font family="Code 128" style="normal" weight="normal">
            <truetype path="C:\windows\fonts\128R00.TTF" />
        </font>
    to map a barcode font so it renders in the PDF output when testing from the template builder plugin.
    Mike's issue is only present when the form field is highlighted with a barcode font. The other fields in the template are OK. What you can do to get around the issue is to bend the config entry so you avoid having to use the barcode font in the template at all. Change the entry to something like:
        <font family="Calibri" style="normal" weight="normal">
            <truetype path="C:\windows\fonts\128R00.TTF" />
        </font>
    Note that we are mapping Calibri, a human-readable and non-'erroring' font in the template, to the Code 128 barcode font. Where you used to highlight the field with the barcode font in MS Word, you now use the Calibri font instead. At run time, BIP will go looking for the Calibri font mapping and will drop in the Code 128 font. Of course, Calibri is just an example; you need to pick a font that you are not going to use anywhere else in the layout.

    Read the article

  • Core Data migration problem: "Persistent store migration failed, missing source managed object model"

    - by John Gallagher
    The Background
    A Cocoa non-document Core Data project with two Managed Object Models. Model 1 stays the same. Model 2 has changed, so I want to migrate the store. I've created a new version via Design > Data Model > Add Model Version in Xcode. The difference between versions is a single relationship that's been changed to a one-to-many. I've made my changes to the model, then saved. I've made a new Mapping Model that has the old model as a source and the new model as a destination. I've ensured all Mapping Models and Data Models are being compiled and all are copied to the Resources folder of my app bundle. I've switched on migrations by passing in a dictionary with the NSMigratePersistentStoresAutomaticallyOption key as [NSNumber numberWithBool:YES] when adding the Persistent Store. Rather than merging all models in the bundle, I've specified the two models I want to use (model 1 and the new version of model 2) and merged them using modelByMergingModels:
    The Problem
    No matter what I do to migrate, I get the error message: "Persistent store migration failed, missing source managed object model."
    What I've Tried
    I clean after every single build. I've tried various combinations of having only the model I'm migrating to in Resources, being compiled, or both. Since the error message implies it can't find the source model for my migration, I've tried having every version of the model in both the Resources folder and being compiled. I've made sure I'm not making a really basic error by switching back to the original version of my data model. The app runs fine. I've deleted the Mapping Model and the new version of the model, cleaned, then recreated both. I've tried making a different change in the new model - deleting an entity instead. I'm at my wits' end. I can't help but think I've made a huge mistake somewhere that I'm not seeing. Any ideas?

    Read the article

  • What causes this org.hibernate.MappingException?

    - by stacker
    I'm trying to configure an EJB3 sample application; its entities were mapped to Postgres, and now I want the app to run on JBoss 4.3 and Informix using JPA. If the DDL creation <property name="hibernate.hbm2ddl.auto" value="create"/> is active, this error appears:
        WARN [ServiceController] Problem starting service persistence.units:ear=weblog.ear,jar=weblog.jar,unitName=weblog
        javax.persistence.PersistenceException: [PersistenceUnit: weblog] Unable to build EntityManagerFactory
            at org.hibernate.ejb.Ejb3Configuration.buildEntityManagerFactory(Ejb3Configuration.java:677)
            at org.hibernate.ejb.HibernatePersistence.createContainerEntityManagerFactory(HibernatePersistence.java:132)
            at org.jboss.ejb3.entity.PersistenceUnitDeployment.start(PersistenceUnitDeployment.java:246)
    followed by
        Caused by: org.hibernate.MappingException: No Dialect mapping for JDBC type: 2005
            at org.hibernate.dialect.TypeNames.get(TypeNames.java:56)
            at org.hibernate.dialect.TypeNames.get(TypeNames.java:81)
            at org.hibernate.dialect.Dialect.getTypeName(Dialect.java:291)
            at org.hibernate.mapping.Column.getSqlType(Column.java:182)
            at org.hibernate.mapping.Table.sqlCreateString(Table.java:394)
            at org.hibernate.cfg.Configuration.generateSchemaCreationScript(Configuration.java:854)
            at org.hibernate.tool.hbm2ddl.SchemaExport.<init>(SchemaExport.java:74)
            at org.hibernate.impl.SessionFactoryImpl.<init>(SessionFactoryImpl.java:311)
            at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1300)
            at org.hibernate.cfg.AnnotationConfiguration.buildSessionFactory(AnnotationConfiguration.java:874)
            at org.hibernate.ejb.Ejb3Configuration.buildEntityManagerFactory(Ejb3Configuration.java:669)
    What does JDBC type 2005 mean? Any idea how I can track down the entity/column that causes the problem? Thanks
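    For reference, 2005 is the value of java.sql.Types.CLOB, so the dialect in use has no column type registered for CLOB columns. A common workaround pattern is to subclass the dialect and register the missing type; the sketch below is an assumption (the class name is made up, and whether the bundled InformixDialect needs this depends on the Hibernate version shipped with the app server):
        import java.sql.Types;
        import org.hibernate.dialect.InformixDialect;

        // Hypothetical custom dialect that adds a mapping for JDBC type 2005 (CLOB).
        public class InformixClobDialect extends InformixDialect {
            public InformixClobDialect() {
                super();
                // Tell schema export which Informix column type to emit for CLOB properties.
                registerColumnType(Types.CLOB, "clob");
            }
        }
    The custom class would then be referenced via the hibernate.dialect property in persistence.xml instead of the stock dialect.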

    Read the article

  • java.sql.Exception ClosedConnection

    - by john
    I am getting the following error:
        java.sql.SQLException: Closed Connection
            at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:112)
            at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:146)
            at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:208)
            at oracle.jdbc.driver.PhysicalConnection.getMetaData(PhysicalConnection.java:1508)
            at com.ibatis.sqlmap.engine.execution.SqlExecutor.moveToNextResultsSafely(SqlExecutor.java:348)
            at com.ibatis.sqlmap.engine.execution.SqlExecutor.handleMultipleResults(SqlExecutor.java:320)
            at com.ibatis.sqlmap.engine.execution.SqlExecutor.executeQueryProcedure(SqlExecutor.java:277)
            at com.ibatis.sqlmap.engine.mapping.statement.ProcedureStatement.sqlExecuteQuery(ProcedureStatement.java:34)
            at com.ibatis.sqlmap.engine.mapping.statement.GeneralStatement.executeQueryWithCallback(GeneralStatement.java:173)
            at com.ibatis.sqlmap.engine.mapping.statement.GeneralStatement.executeQueryForList(GeneralStatement.java:123)
            at com.ibatis.sqlmap.engine.impl.SqlMapExecutorDelegate.queryForList(SqlMapExecutorDelegate.java:614)
            at com.ibatis.sqlmap.engine.impl.SqlMapExecutorDelegate.queryForList(SqlMapExecutorDelegate.java:588)
            at com.ibatis.sqlmap.engine.impl.SqlMapSessionImpl.queryForList(SqlMapSessionImpl.java:118)
            at org.springframework.orm.ibatis.SqlMapClientTemplate$3.doInSqlMapClient(SqlMapClientTemplate.java:268)
            at org.springframework.orm.ibatis.SqlMapClientTemplate.execute(SqlMapClientTemplate.java:193)
            at org.springframework.orm.ibatis.SqlMapClientTemplate.executeWithListResult(SqlMapClientTemplate.java:219)
            at org.springframework.orm.ibatis.SqlMapClientTemplate.queryForList(SqlMapClientTemplate.java:266)
            at gov.hud.pih.eiv.web.authentication.AuthenticationUserDAO.isPihUserDAO(AuthenticationUserDAO.java:24)
            at gov.hud.pih.eiv.web.authorization.AuthorizationProxy.isAuthorized(AuthorizationProxy.java:125)
            at gov.hud.pih.eiv.web.authorization.AuthorizationFilter.doFilter(AuthorizationFilter.java:224)
            at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:246)
            at
    I am really stumped and can't figure out what could be causing this error. I am not able to reproduce the error on my machine, but on production it is happening a lot. I am using iBatis in the whole application, so there is no chance of my code not closing connections. We do have stored procedures that run for a long time before they return results (around 15 seconds). Does anyone have any ideas on what could be causing this? I don't think raising the number of connections on the application server will fix this issue, because if connections were running out then we'd see "Error on allocating connections".

    Read the article

  • Where should 'CreateMap' statements go?

    - by jonathanconway
    I frequently use AutoMapper to map Model (Domain) objects to ViewModel objects, which are then consumed by my Views, in a Model/View/View-Model pattern. This involves many 'Mapper.CreateMap' statements, which all must be executed, but must only be executed once in the lifecycle of the application. Technically, then, I should keep them all in a static method somewhere, which gets called from my Application_Start() method (this is an ASP.NET MVC application). However, it seems wrong to group a lot of different mapping concerns together in one central location. Especially when mapping code gets complex and involves formatting and other logic. Is there a better way to organize the mapping code so that it's kept close to the ViewModel that it concerns? (I came up with one idea - having a 'CreateMappings' method on each ViewModel, and in the BaseViewModel, calling this method on instantiation. However, since the method should only be called once in the application lifecycle, it needs some additional logic to cache a list of ViewModel types for which the CreateMappings method has been called, and then only call it when necessary, for ViewModels that aren't in that list.)

    Read the article

  • Implementing Struts 2 Interceptors using Struts 1

    - by Andriy Zakharchuk
    Hello all, I have a legacy application written with Struts 1. The only feature I was asked to add is to protect some actions. Currently any user can do whatever he/she wants. The idea is to allow all users to see the data, but block modification operations, i.e. to modify data a user should log in. I know Struts 2 has interceptors, so I could attach them to the required actions and forward users to the login page when needed. But how can I do a similar thing in a Struts 1 application? My first idea was to create my own abstract Action class:
        public abstract class AuthenticatedAction extends Action {
            public ActionForward execute(
                    ActionMapping mapping,
                    ActionForm form,
                    HttpServletRequest theRequest,
                    HttpServletResponse theResponse) {
                if (!logged) { // 'logged' stands for the session-based authentication check
                    // forward to the log in form ("login" is a placeholder forward name)
                    return mapping.findForward("login");
                } else {
                    return doExecute(mapping, form, theRequest, theResponse);
                }
            }

            public abstract ActionForward doExecute(
                    ActionMapping mapping,
                    ActionForm form,
                    HttpServletRequest theRequest,
                    HttpServletResponse theResponse);
        }
    Then change all actions that require authentication from "extends Action" to "extends AuthenticatedAction", then add a login form and a login action (which performs authentication and puts this status into the session), and change the JSP header tile to display an authentication block, e.g., "You are (not logged in)/", Login/Logout. I guess this should solve the problem. If this doesn't solve the problem, please explain to me why. Is there any better (more elegant, like interceptors are) way to do this? Thank you in advance.
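    For illustration only, a concrete action migrated to that base class could look like the sketch below; the class name, forward name, and business logic are hypothetical and not taken from the question:
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;
        import org.apache.struts.action.ActionForm;
        import org.apache.struts.action.ActionForward;
        import org.apache.struts.action.ActionMapping;

        // Hypothetical protected action: it extends the new base class and
        // implements doExecute() instead of execute(), so the login check in
        // the superclass runs before any modification logic.
        public class UpdateRecordAction extends AuthenticatedAction {
            public ActionForward doExecute(
                    ActionMapping mapping,
                    ActionForm form,
                    HttpServletRequest request,
                    HttpServletResponse response) {
                // ... the existing modification logic moves here unchanged ...
                return mapping.findForward("success");
            }
        }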

    Read the article

  • Entity Framework Multiple associations to a table causes error 3033

    - by taylonr
    I'm using EF 3.5 SP1. I have 3 tables:
    1. Pendants
    2. PendantAccessories
    3. PartsData
    Basically #1 and #2 are used for product selection, so #1 has a "Number of Buttons" property and other options, and #2 has fields like "Cable Type" etc. The third table contains property information for all of our parts, such as what plant it's manufactured in, its weight, etc. What I'm trying to do is set up an association between #1 and #3 and also between #2 and #3. The PK in all 3 tables is the PartNumber. I set it up between #2 and #3 by going into Mapping Details, adding a "Maps to PartsData" entry, and mapping the columns. Everything worked fine. I then tried the same thing between #1 and #3. However, now when I compile I get "Error 3033: Problem in Mapping Fragment starting at line 713: EntitySets 'pendants' and 'pendantAccessories' are both mapped to the table 'PartsData'. Their Primary Keys may collide." Does anyone know what I'm doing wrong here?

    Read the article

  • Open a buffer as a vertical split in VIM

    - by alfredodeza
    If you are editing a file in VIM and then you need to open an existing buffer (e.g. from your buffer list: :buffers), how can you open it in a vertical split? I know that you can already open it with a normal split like: :sbuffer N where N is the buffer number you want; however, the above opens that N buffer horizontally, not vertically. I'm also aware that you can change the window placement after opening and have a vertical split like so: Ctrl-W H or Ctrl-W L, which will vertically split the window to the left or the right. It seems to me that if there is an sbuffer there should be a vsbuffer, but that doesn't exist (not that I am aware of). Also, please note that I am not looking for a plugin to solve this question. I know about a wealth of plugins that will allow you to do this. I am sure I might be missing something that is already there.
    EDIT: In the best spirit of collaboration, I have created a simple function with a mapping if someone else stumbles across this issue and does not want to install a plugin:
    Function:
        " Vertical Split Buffer Function
        function VerticalSplitBuffer(buffer)
            execute "vert belowright sb" a:buffer
        endfunction
    Mapping:
        " Vertical Split Buffer Mapping
        command -nargs=1 Vbuffer call VerticalSplitBuffer(<f-args>)
    This accomplishes the task of opening a buffer in a right split, so for buffer 1, you would call it like:
        :Vbuffer 1

    Read the article

  • Struts ActionError

    - by user287663
    Hi all. Does anyone know why the code below doesn't compile? The reason given is that it could not find the symbol ActionError. Thanks in advance.
        package com.hbs;

        import javax.servlet.RequestDispatcher;
        import javax.servlet.ServletException;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpSession;
        import javax.servlet.http.HttpServletResponse;
        import org.apache.struts.action.Action;
        import org.apache.struts.action.ActionError;
        import org.apache.struts.action.ActionErrors;
        import org.apache.struts.action.ActionForm;
        import org.apache.struts.action.ActionMapping;
        import org.apache.struts.action.ActionForward;
        import org.apache.struts.util.MessageResources;
        import org.apache.commons.beanutils.PropertyUtils;

        public class FeedbackAction extends org.apache.struts.action.Action {
            private final static String SUCCESS = "success";

            public ActionForward execute(ActionMapping mapping, ActionForm form,
                    HttpServletRequest request, HttpServletResponse response) throws Exception {
                ActionErrors errors = new ActionErrors();
                String fullName = (String) PropertyUtils.getSimpleProperty(form, "fullName");
                String fullName1 = "";
                if (fullName.equals(fullName1)) {
                    errors.add("fullName", new ActionError("error.fullName", fullName));
                    saveErrors(request, errors);
                    return (new ActionForward(mapping.getInput()));
                }
                return mapping.findForward(SUCCESS);
            }
        }
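    One possible explanation, offered as an assumption rather than a confirmed diagnosis: ActionError was deprecated in Struts 1.2 and removed in Struts 1.3, so compiling against a Struts 1.3.x jar produces exactly this "cannot find symbol". If that is the situation here, the failing block can be rewritten with ActionMessage; a sketch of the change, meant to drop into the execute() method shown above:
        // Replace the ActionError import with ActionMessage. ActionErrors still
        // accepts ActionMessage entries because it extends ActionMessages.
        import org.apache.struts.action.ActionErrors;
        import org.apache.struts.action.ActionMessage;

        // Inside execute():
        ActionErrors errors = new ActionErrors();
        if (fullName.equals(fullName1)) {
            errors.add("fullName", new ActionMessage("error.fullName", fullName));
            saveErrors(request, errors);
            return new ActionForward(mapping.getInput());
        }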

    Read the article

  • Spring-MVC Problem using @Controller on controller implementing an interface

    - by layne
    I'm using spring 2.5 and annotations to configure my spring-mvc web context. Unfortunately, I am unable to get the following to work. I'm not sure if this is a bug (seems like it) or if there is a basic misunderstanding on how the annotations and interface implementation subclassing works. For example, @Controller @RequestMapping("url-mapping-here") public class Foo { @RequestMapping(method=RequestMethod.GET) public void showForm() { ... } @RequestMapping(method=RequestMethod.POST) public String processForm() { ... } } works fine. When the context starts up, the urls this handler deals with are discovered, and everything works great. This however does not: @Controller @RequestMapping("url-mapping-here") public class Foo implements Bar { @RequestMapping(method=RequestMethod.GET) public void showForm() { ... } @RequestMapping(method=RequestMethod.POST) public String processForm() { ... } } When I try to pull up the url, I get the following nasty stack trace: javax.servlet.ServletException: No adapter for handler [com.shaneleopard.web.controller.RegistrationController@e973e3]: Does your handler implement a supported interface like Controller? org.springframework.web.servlet.DispatcherServlet.getHandlerAdapter(DispatcherServlet.java:1091) org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:874) org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:809) org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:571) org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:501) javax.servlet.http.HttpServlet.service(HttpServlet.java:627) However, if I change Bar to be an abstract superclass and have Foo extend it, then it works again. @Controller @RequestMapping("url-mapping-here") public class Foo extends Bar { @RequestMapping(method=RequestMethod.GET) public void showForm() { ... } @RequestMapping(method=RequestMethod.POST) public String processForm() { ... } } This seems like a bug. The @Controller annotation should be sufficient to mark this as a controller, and I should be able to implement one or more interfaces in my controller without having to do anything else. Any ideas?

    Read the article

  • Java webapp: how to implement a web bug (1x1 pixel)?

    - by NoozNooz42
    In the accepted answer to the following question, an SO regular with 13K+ rep suggests using a "web bug" (a non-cacheable 1x1 img) to be able to track requests in the logs: http://stackoverflow.com/questions/1784893
    How can I do this in Java? Basically, I've got two issues:
    1. how to make sure the 1x1 image is not cacheable (how to set the headers)?
    2. how to make sure the queries for these 1x1 images will appear in the logs?
    I'm looking for the exact piece of code because I know how to write a .jsp/servlet and I know how to serve a 1x1 image :) My question is really about the exact .jsp/servlet that I should write and how/what needs to be done so that Tomcat logs the request. For example I plan to use the following mapping:
        <servlet-mapping>
            <servlet-name>WebBugServlet</servlet-name>
            <url-pattern>/webbug*</url-pattern>
        </servlet-mapping>
    and then use an img tag referencing a "webbug.png" (or .gif), so how do I write the .jsp/servlet? What/where should I look for in the logs?
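    A minimal sketch of a servlet that would sit behind the WebBugServlet mapping above; everything beyond the class name is an assumption, and the requests only show up in the logs if the container's access logging (e.g. Tomcat's AccessLogValve) is enabled:
        import java.awt.image.BufferedImage;
        import java.io.IOException;
        import java.io.OutputStream;
        import javax.imageio.ImageIO;
        import javax.servlet.http.HttpServlet;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;

        public class WebBugServlet extends HttpServlet {
            @Override
            protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
                // Tell browsers and proxies not to cache the pixel, so every page
                // view produces a fresh request that the access log can record.
                resp.setContentType("image/png");
                resp.setHeader("Cache-Control", "no-cache, no-store, must-revalidate");
                resp.setHeader("Pragma", "no-cache");
                resp.setDateHeader("Expires", 0);

                // A 1x1 fully transparent image generated on the fly.
                BufferedImage pixel = new BufferedImage(1, 1, BufferedImage.TYPE_INT_ARGB);
                OutputStream out = resp.getOutputStream();
                ImageIO.write(pixel, "png", out);
                out.flush();
            }
        }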

    Read the article

  • How good is the memory mapped Circular Buffer on Wikipedia?

    - by abroun
    I'm trying to implement a circular buffer in C, and have come across this example on Wikipedia. It looks as if it would provide a really nice interface for anyone reading from the buffer, as reads which wrap around from the end to the beginning of the buffer are handled automatically. So all reads are contiguous. However, I'm a bit unsure about using it straight away as I don't really have much experience with memory mapping or virtual memory and I'm not sure that I fully understand what it's doing. What I think I understand is that it's mapping a shared memory file the size of the buffer into memory twice. Then, whenever data is written into the buffer it appears in memory in 2 places at once. This allows all reads to be contiguous. What would be really great is if someone with more experience of POSIX memory mapping could have a quick look at the code and tell me if the underlying mechanism used is really that efficient. Am I right in thinking for example that the file in /dev/shm used for the shared memory always stays in RAM or could it get written to the hard drive (performance hit) at some point? Are there any gotchas I should be aware of? As it stands, I'm probably going to use a simpler method for my current project, but it'd be good to understand this to have it in my toolbox for the future. Thanks in advance for your time.

    Read the article

  • Does Hibernate support one-to-one associations as pkeys?

    - by Andrzej Doyle
    Hi all, Can anyone tell me whether Hibernate supports associations as the pkey of an entity? I thought that this would be supported but I am having a lot of trouble getting any kind of mapping that represents this to work. In particular, with the straight mapping below: @Entity public class EntityBar { @Id @OneToOne(optional = false, mappedBy = "bar") EntityFoo foo // other stuff } I get an org.hibernate.MappingException: "Could not determine type for: EntityFoo, at table: ENTITY_BAR, for columns: [org.hibernate.mapping.Column(foo)]" Diving into the code it seems the ID is always considered a Value type; i.e. "anything that is persisted by value, instead of by reference. It is essentially a Hibernate Type, together with zero or more columns." I could make my EntityFoo a value type by declaring it serializable, but I wouldn't expect this would lead to the right outcome either. I would have thought that Hibernate would consider the type of the column to be integer (or whatever the actual type of the parent's ID is), just like it would with a normal one-to-one link, but this doesn't appear to kick in when I also declare it an ID. Am I going beyond what is possible by trying to combine @OneToOne with @Id? And if so, how could one model this relationship sensibly?
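    For reference, JPA 2.0 (supported from Hibernate 3.5 / Hibernate Annotations 3.5 onwards) added "derived identities" for exactly this situation; whether it applies here depends on the Hibernate version in use. A minimal sketch, assuming EntityFoo has a simple Long id, and noting that this makes EntityBar the owning side of the association, unlike the mappedBy mapping in the question:
        import javax.persistence.Entity;
        import javax.persistence.Id;
        import javax.persistence.JoinColumn;
        import javax.persistence.MapsId;
        import javax.persistence.OneToOne;

        @Entity
        public class EntityBar {
            @Id
            private Long id;                 // takes its value from foo's primary key

            @MapsId                          // marks foo as the source of this entity's id
            @OneToOne(optional = false)
            @JoinColumn(name = "foo_id")     // assumed column name
            private EntityFoo foo;

            // getters and setters omitted
        }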

    Read the article

  • How do I correctly handle ZoneLocalMapping.ResultType.Ambiguous?

    - by RWC
    In my code I try to handle ZoneLocalMapping.ResultType.Ambiguous. The line unambiguousLocalDateTime = localDateTimeMapping.EarlierMapping; throws an InvalidOperationException with message "EarlierMapping property should not be called on a result of type Ambiguous". I have no clue how I should handle it. Can you give me an example? This is what my code looks like: public Instant getInstant(int year, int month, int day, int hour, int minute) { var localDateTime = new LocalDateTime(year, month, day, hour, minute); //invalidated, might be not existing var timezone = DateTimeZone.ForId(TimeZoneId); //TimeZone is set elsewhere, example "Brazil/East" var localDateTimeMapping = timezone.MapLocalDateTime(localDateTime); ZonedDateTime unambiguousLocalDateTime; switch (localDateTimeMapping.Type) { case ZoneLocalMapping.ResultType.Unambiguous: unambiguousLocalDateTime = localDateTimeMapping.UnambiguousMapping; break; case ZoneLocalMapping.ResultType.Ambiguous: unambiguousLocalDateTime = localDateTimeMapping.EarlierMapping; break; case ZoneLocalMapping.ResultType.Skipped: unambiguousLocalDateTime = new ZonedDateTime(localDateTimeMapping.ZoneIntervalAfterTransition.Start, timezone); break; default: throw new InvalidOperationException(string.Format("Unexpected mapping result type: {0}", localDateTimeMapping.Type)); } return unambiguousLocalDateTime.ToInstant(); } If I look at class ZoneLocalMapping I see the following code: /// <summary> /// In an ambiguous mapping, returns the earlier of the two ZonedDateTimes which map to the original LocalDateTime. /// </summary> /// <exception cref="InvalidOperationException">The mapping isn't ambiguous.</exception> public virtual ZonedDateTime EarlierMapping { get { throw new InvalidOperationException("EarlierMapping property should not be called on a result of type " + type); } } That's why I am receiving the exception, but what should I do to get the EarlierMapping?

    Read the article

  • Robotlegs: Warning: Injector already has a rule for type

    - by MikeW
    I have a bunch of warning messages like this appearing when using Robotlegs/Signals. Every time this command class executes, which is every 2-3 seconds, this message displays:
        If you have overwritten this mapping intentionally you can use "injector.unmap()" prior to your replacement mapping in order to avoid seeing this message. Warning: Injector already has a rule for type "mx.messaging.messages::IMessage", named "".
    The command functions fine otherwise, but I think I'm doing something wrong anyhow.
        public class MessageReceivedCommand extends SignalCommand {
            [Inject]
            public var message:IMessage;
            ...etc.. do something with message..
        }
    The application context doesn't map IMessage to this command, as I only see an option to mapSignalClass; besides, the payload is received fine. Wonder if anyone knows how I might either fix or suppress this message. I've tried calling injector.unmap(IMessage, "") as the warning suggests, but I receive an error - no mapping found for ::IMessage named "". Thanks
    Edit: A bit more info about the error. Here is the signal that I dispatch to the command:
        public class GameMessageSignal extends Signal {
            public function GameMessageSignal() {
                super(IMessage);
            }
        }
    which is dispatched from an IPushDataService class:
        gameMessage.dispatch(message.message);
    and the implementation is wired up in the app context via:
        injector.mapClass(IPushDataService, PushDataService);
    along with the signal:
        signalCommandMap.mapSignalClass(GameMessageSignal, MessageReceivedCommand);
    Edit #2: Probably good to point out also that I inject an instance of GameMessageSignal into IPushDataService:
        public class PushDataService extends BaseDataService implements IPushDataService {
            [Inject]
            public var gameMessage:GameMessageSignal;
            //then
            private function processMessage(message:MessageEvent):void {
                gameMessage.dispatch(message.message);
            }
        }
    Edit #3: The mappings I set up in the SignalContext:
        injector.mapSingleton(IPushDataService);
        injector.mapClass(IPushDataService, PushDataService);

    Read the article

  • Can't Add XSD to Class Project

    - by Jeff
    Background: I started with a large solution with many applications in it in VS 2008 and I'm trying to split it up.
    Steps to repeat:
    1. I create a new VS 2010 C# class project.
    2. I right click and choose Add Existing Item.
    3. I choose the XSD file from my old project and import it.
    The original file is 67KB; the imported file is 18KB.
    Lines 134-135 of the original file:
        <Mapping SourceColumn="ConfigType" DataSetColumn="ConfigType" />
        <Mapping SourceColumn="ConfigValue" DataSetColumn="ConfigValue" />
    Lines 135-136 of the resulting file:
        <Mapping SourceColumn="ConfigType" DataSetColumn="ConfigType" />
        <Mappi
    Part way through its life my old project was upgraded from 2.0 to 3.5, so some of the code is. Manually copying and pasting the xsd source into the new file and updating the 2.0.0.0 to 4.0.0.0 allowed me to open it in the GUI for editing XSD files. After fixing all the connection strings and right clicking on every query and clicking Configure then Finish, I was able to gain access to one of the table adapters out of 6. I'm stumped as to how to get this to compile. Once it compiles I'm open sourcing it, so ask if you want to see the code.

    Read the article

  • Grep without storing search to the "/ register in Vim

    - by Phro
    In my .vimrc I have a mapping that makes a line of text 'title capitalized': noremap <Leader>at :s/\v<(.)(\w{2,})/\u\1\L\2/g<CR> However, whenever I run this function, it highlights every word that is at least three characters long in my entire document. Of course I could get this behaviour to stop simply by appending :nohlsearch<CR> to the end of the mapping, but this is more of an awkward hack that still avoids a bigger problem: The last search has been replaced by \v<(.)(\w{2,}). Is there any way to use the search commands in Vim without storing the last search in the "/ register; a 'silent' search of sorts? That way, after running this title-making command, I can still use my previous search to navigate the document using n, N, etc. Edit Using @brettanomyces' answer, I found that simply setting the mapping: noremap <Leader>at :call setline(line('.'),substitute(getline('.'), '\v<(.)(\w{2,})', '\u\1\L\2', 'g'))<CR> will successfully perform the substitution without storing the searched text into the / register.

    Read the article

  • CodePlex Daily Summary for Monday, February 22, 2010

    CodePlex Daily Summary for Monday, February 22, 2010New ProjectsAVDB: System to keep track of orders and the inventory of televisions, DVDs, VCRs etcBooky: Booky is an online Bookmark Management Tool. Gear Up for Lord of the Rings Online (lotro): Windows utility for checking what your LOTRO character currently has equipped and figuring out gear you should get to improve your stats.GotSharp Extensions: GotSharp Extensions is a set of helpful classes and extension methods that can make your coding experience easier and cleaner. Halfwit: A minimalist WPF Twitter client.HOA Starter Kit: A community subdivision website starter kit. First draft.Lua For Irony: Project to define the Lua language using the Irony (http://irony.codeplex.com/) development kit. This work is based heavily on the work done for V...MimeCloud: Scalable .NET Digital Asset & Media Management: MimeCloud is a scalable digital asset library & media management toolset. Founded by Alex Norcliffe and Peter Miller Written by people who have b...Parallel Mandelbrot Set solver: Solving the Mandelbrot set using the Parallel class in .NET 4.0. Showing the resulting image in a WPF application. The solution file requires VS 2010.Pomogad - Pomodoro Windows Gadget: Você usa Pomodoro Technique? Não sabe o que é? Veja aqui http://www.pomodorotechnique.com Agora que você já sabe, que tal usar essa técnica? E p...PostCrap - flyweight .NET AOP post compiler: PostCrap is a flyweight attribute based aspect injection .NET post compiler It is written in C# and uses Mono.Cecil to modify assemblies and injec...Software + Service Reference Demo Kit: MS China Developer and Platform Evangelism team created an End-2-End demo for Software + Service. Yet Another SharePoint Tool: YEAST provides you with a simple to integrate approach to generating SharePoint solution packages as part of a Visual Studio project. Zen Coding Visual Studio Plugin: Zen Coding for Visual Studio is plugin for HTML and CSS hi-speed codingNew Releases.Net MSBuild Google Closure Compiler Task: .Net MSBuild Google Closure Compiler Task 1.1: - Corrected issue with regular expression source file and renamingdotNails: dotNails_0.5.9: NOTE - the latest source code has been moved to google code to take advantage of Mercurial source control - http://code.google.com/p/dotnails/sourc...EasyWFUnit: EasyWFUnit-2.2: Release 2.2 of EasyWFUnit, an extension library to support unit testing of Windows Workflow, includes a revised WinForm GUI Test Builder that utili...Fluent Ribbon Control Suite: Fluent Ribbon Control Suite BETA2 (for .NET 4.0RC): Includes Fluent.dll (with .pdb and .xml) and test application compiled with .NET 4.0 RC.FolderSize: FolderSize.Win32.1.0.3.0: FolderSize.Win32.1.0.3.0 A simple utility intended to be used to scan harddrives for the folders that take most place and display this to the user...Fusion Charts Free for SharePoint: 1.3: Fix release for issue #11833 : Feature Must Be Activated on Root of Web Application.GotSharp Extensions: 1.0: First release, containing only a few extension methods for the System.String and System.IO.Stream classes, and a Range utility class.Jeremy's Experimental Repository: FluentValidation with IoC Sample: Sample code for the blog post Using FluentValidation with an IoC containerMiniTwitter: 1.08: MiniTwitter 1.08 更新内容 修正 自動更新が CodePlex の変更で動いていなかった問題を修正 自動更新に失敗すると落ちるバグを修正 通知領域アイコン右クリックで表示されるメニューが消えないバグを修正 変更 ハッシュタグの抽出条件を変更 API のエンドポイ...MSTS Editors & Tools: Simis Editor v0.3: Simis Editor v0.3 Enabled Edit > Undo and Edit > Redo. 
Undoing/redoing back to last saved state is identified as saved (no prompt on exit, etc.)....Parallel Mandelbrot Set solver: Alpha 1: First releaseParallelTasks: ParallelTasks 2.0 beta1: ParallelTasks 2.0 is a total re-write of the original version. Featuring improved performance and stability and a more consistent API.Personal Expense Tracker: Personal Expense Tracker v0.1 beta: This is the first beta release. Please provide me with your feedback.PostCrap - flyweight .NET AOP post compiler: PostCrap 1.0 AOP source and binaries: PostCrap 1.0 source and binaries (the unit test project contains sample interceptor attributes for exception handling & logging)Protoforma | Tactica Adversa: Skilful 0.1.3.276: AlphaRawr: Rawr 2.3.10: - More improvements to the default filters - Further improvement on avoiding useless gem swaps from the Optimizer. - Normal/Heroic ICC items shou...Reusable Library: v1.0.2: A collection of reusable abstractions for enterprise application developer.Sem.Sync: 2010-02-21 - Synchronization Manager - Beta: This release is not tested very well, so you should use this version only to evaluate new features. - Changed way of handling source-ids in order ...Survey - web survey & form engine: Survey 1.1.0: Release Survey v. 1.1.0.0 Major changes: - layout & graphics completely overhauled - several technical changes & repairs (e.g. matrix question iss...Yet Another SharePoint Tool: Version 1: Version 1Zeta Resource Editor: Release 2010-02-21: New source code release.Most Popular ProjectsWBFS ManagerRawrAJAX Control ToolkitMicrosoft SQL Server Product Samples: DatabaseSilverlight ToolkitWindows Presentation Foundation (WPF)Image Resizer Powertoy Clone for WindowsASP.NETDotNetNuke® Community EditionMicrosoft SQL Server Community & SamplesMost Active ProjectsDinnerNow.netRawrBlogEngine.NETNB_Store - Free DotNetNuke Ecommerce Catalog ModuleSharpyjQuery Library for SharePoint Web ServicesSharePoint ContribInfoServicepatterns & practices – Enterprise LibraryPHPExcel

    Read the article

  • What kind of specific projects can I do to master bitwise operations in C++? Also is there a canonical book? [closed]

    - by Ford
    I don't use C++ or bitwise operations at my current job but I'm thinking of applying to companies where it is a requirement to be fluent with them (on their tests anyway). So my question is: Can anyone suggest a project which will require gaining a fluency in bitwise operations to complete? On a side note, is there a canonical book on optimization techniques using bitwise operations since that seems to be an important use of them?

    Read the article

  • VB like with keywords in C#

    - by Tanzim Saqib
    AspectF is an open source utility which offers separation of concerns in a fluent way. I am personally a big fan as well as a contributor to this project. It is very simple, easy to implement, and an excellent way to incorporate regular everyday logic into your business code from one single class, AspectF. I have added a couple of new features to it, which are yet to be committed to source control. However, here's one feature that I introduced today: the ability to write a VB-like with keyword...(read more)

    Read the article

  • Hiring MySQL Curriculum Developer

    - by Antoinette O'Sullivan
    If you want to be part of the team that creates the Official Oracle Training on MySQL and meet the following criteria: Experience of Course Design and Development Experience of database such as MySQL Fluent in English - written and spoken Keen to keep on learning Then this is the opportunity for you! Learn more about our open position for MySQL Curriculum Developer here.

    Read the article

  • Using NBuilder to mock up a data driven UI - Part 1

    In this article we will take a look at a fairly new open source project called NBuilder (http://www.nbuilder.org and http://code.google.com/p/nbuilder/) and how it can be used to provide us with fake data out of the gate. NBuilder allows you to quickly stand up generated objects based on standard .net types in an easy fluent manner. And that is just the start!

    Read the article
