Search Results

Search found 2185 results on 88 pages for 'n way merge'.

Page 38/88

  • JPA returning null for deleted items from a set

    - by Jon
    This may be related to my question from a few days ago, but I'm not even sure how to explain this part. (It's an entirely different parent-child relationship.) In my interface, I have a set of attributes (Attribute) and valid values (ValidValue) for each one in a one-to-many relationship. In the Spring MVC frontend, I have a page for an administrator to edit these values. Once it's submitted, if any of these fields (as <input> tags) are blank, I remove the ValidValue object like so:

      Set<ValidValue> existingValues = new HashSet<ValidValue>(attribute.getValidValues());
      Set<ValidValue> finalValues = new HashSet<ValidValue>();
      for (ValidValue validValue : attribute.getValidValues()) {
          if (!validValue.getValue().isEmpty()) {
              finalValues.add(validValue);
          }
      }
      existingValues.removeAll(finalValues);
      for (ValidValue removedValue : existingValues) {
          getApplicationDataService().removeValidValue(removedValue);
      }
      attribute.setValidValues(finalValues);
      getApplicationDataService().modifyAttribute(attribute);

    The problem is that while the database is updated appropriately, the next time I query for the Attribute objects, they're returned with an extra entry in their ValidValue set -- a null -- and thus, the next time I iterate through the values to display, it shows an extra blank value in the middle. I've confirmed that this happens at the point of a merge or find, at the point of "Execute query ReadObjectQuery(entity.Attribute)". Here's the code I'm using to modify the database (in the ApplicationDataService):

      public void modifyAttribute(Attribute attribute) {
          getJpaTemplate().merge(attribute);
      }

      public void removeValidValue(ValidValue removedValue) {
          ValidValue merged = getJpaTemplate().merge(removedValue);
          getJpaTemplate().remove(merged);
      }

    Here are the relevant parts of the entity classes:

      @Entity
      @Table(name = "attribute")
      public class Attribute {
          @OneToMany(cascade = CascadeType.ALL, fetch = FetchType.LAZY, mappedBy = "attribute")
          private Set<ValidValue> validValues = new HashSet<ValidValue>(0);
      }

      @Entity
      @Table(name = "valid_value")
      public class ValidValue {
          @ManyToOne(fetch = FetchType.LAZY)
          @JoinColumn(name = "attr_id", nullable = false)
          private Attribute attribute;
      }
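    One common cause of a phantom entry like this is that only one side of the bidirectional relationship is kept in sync before the delete, so the cached parent can still hold a reference to the removed child. A hypothetical sketch of detaching both sides before removing (it assumes ValidValue exposes a setAttribute setter, which the excerpt above does not show):

      for (ValidValue removedValue : existingValues) {
          // Drop the child from the parent's collection so a cascaded merge of
          // attribute cannot carry a reference to the deleted row.
          attribute.getValidValues().remove(removedValue);
          // Clear the owning side as well (assumed setter, not shown in the post).
          removedValue.setAttribute(null);
          getApplicationDataService().removeValidValue(removedValue);
      }
      getApplicationDataService().modifyAttribute(attribute);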

    Read the article

  • Is this a situation where I should "hg push -f"?

    - by user144182
    I have two machines, A and B that both access an external hg repository. I did some development on A, wasn't ready to push changesets to the external, and needed to switch machines, so I pushed the changesets to B using hg serve. Changesets continued on B, were committed and then pushed to external repo. I then pulled on A and updated to default/tip. This left the local changesets that had previously been pushed to B as a branch, but because of how I pushed things around, the changes in the local changesets are already in default/tip. I've now continued to make changes and commit locally on A, but when I try to push hg asks me to merge or do push -f instead. I know push -f is almost never recommended. This situation is close to one where I should use rebase, however the changesets that would be "rebased" I don't really need locally or in the external repository since they are already effectively in default/tip via the push to B. Now, I know I could merge with the latest local changeset and just discard the changes, but then I would still have to commit the merge which gets me back into rebase territory. Is this a case where I could do hg push -f? Also, why would pushing from A create remote heads if I've updated to default/tip before I continued to commit changesets?

    Read the article

  • Git Svn dcommit error - restart the commit

    - by Rob Wilkerson
    Last week, I made a number of changes to my local branch before leaving town for the weekend. This morning I wanted to dcommit all of those changes to the company's Svn repository, but I get a merge conflict in one file:

      Merge conflict during commit: Your file or directory 'build.properties.sample' is probably out-of-date: The version resource does not correspond to the resource within the transaction. Either the requested version resource is out of date (needs to be updated), or the requested version resource is newer than the transaction root (restart the commit).

    I'm not sure exactly why I'm getting this, but before attempting to dcommit, I did a git svn rebase. That "overwrote" my commits. To recover from that, I did a git reset --hard HEAD@{1}. Now my working copy seems to be where I expect it to be, but I have no idea how to get past the merge conflict; there's not actually any conflict to resolve that I can find. Any thoughts would be appreciated.

    EDIT: Just wanted to specify that I am working locally. I have a local branch for the trunk that references svn/trunk (the remote branch). All of my work was done on the local trunk:

      $ git branch
        maint-1.0.x
        master
      * trunk

      $ git branch -r
        svn/maintenance/my-project-1.0.0
        svn/trunk

    Similarly, git log currently shows 10 commits on my local trunk since the last commit with a Svn ID. Hopefully that answers a few questions. Thanks again.

    Read the article

  • Can I keep git from pushing the master branch to all remotes by default?

    - by Curtis
    I have a local git repository with two remotes ('origin' is for internal development, and 'other' is for an external contractor to use). The master branch in my local repository tracks the master in 'origin', which is correct. I also have a branch 'external' which tracks the master in 'other'. The problem I have now is that my master branch ALSO wants to push to the master in 'other', which is an issue. Is there any way I can specify that the local master should NOT push to other/master? I've already tried updating my .git/config file to include:

      [branch "master"]
          remote = origin
          merge = refs/heads/master
      [branch "external"]
          remote = other
          merge = refs/heads/master
      [push]
          default = upstream

    But remote show still shows that my master is pushing to both remotes:

      toko:engine cmlacy$ git remote show origin
      Password:
      * remote origin
        Fetch URL: <REPO LOCATION>
        Push URL: <REPO LOCATION>
        HEAD branch: master
        Remote branches:
          master       tracked
          refresh-hook tracked
        Local branch configured for 'git pull':
          master merges with remote master
        Local ref configured for 'git push':
          master pushes to master (up to date)

    Those are all correct.

      toko:engine cmlacy$ git remote show other
      Password:
      * remote other
        Fetch URL: <REPO LOCATION>
        Push URL: <REPO LOCATION>
        HEAD branch: master
        Remote branch:
          master tracked
        Local branch configured for 'git pull':
          external merges with remote master
        Local ref configured for 'git push':
          master pushes to master (local out of date)

    That last section is the problem. 'external' should merge with other/master, but master should NEVER push to other/master. It's never going to work.

    Read the article

  • A question of long-running and disruptive branches

    - by Matt Enright
    We are about to begin prototyping a new application that will share some existing infrastructure assemblies with an existing application, and also involve a significant subset of the existing domain model. Parts of the domain model will likely undergo some serious changes for this new application, and the endgame for all of this, once the new application has been fully specified and is launch-ready is that we would like to re-unify the models of the two applications (as well as share a database, link functionality, etc.), but for the duration of development, prototyping, etc, we will be using a separate database so that we can change things without worrying about impact to development or use of the existing application. Since it is a prototype, there will be a pretty long window during which serious changes or rearchitecturing can occur as product management experiments with different workflows, different customer bases are surveyed, and we try and keep up. We have already made a Subversion branch, so as to not impact concurrent development on the mature application, and are toying with 2 potential ways of moving forward with this: Use the svn branch as the sole mechanism of separation. Make our changes to the existing domain models, and evaluate their impact on the existing application (and make requisite changes to ProjectA) when we have established that our long-running side branch is stable enough for re-entry to trunk. "Fork" the shared code (temporarily): Copy ProjectA.Entities to NewProject.Entities, and treat all of the NewProject code as self-contained. When all of the perturbations around the model have died down and we feel satisfied, manually re-integrate the changes (as granular or sweeping as warranted) back into ProjectA.Entities, updating ProjectA to use the improved models at each step (this can take place either before or after the subversion merge has occurred). The subversion merge will then not handle recombination of any of the heavy changes here. Note: the "fork" method only applies to the code we see significant changes in store for, and whose modification will break ProjectA - shared infrastructure stuff for example, we would just modify in place (on our branch) and let the merge sort out. Development is hard, go shopping. Naturally, after not coming to an agreement, we're turning it over to the oracle of power that is SO. Any experience with any of these methods, pain points to watch out for, something new entirely?

    Read the article

  • PHP: Getting the 'new' values in an array

    - by Mark
    Trying to learn about PHP's arrays today. I have a set of arrays like this:

      $a = array(
          0 => array('value' => 'America'),
          1 => array('value' => 'England'),
          2 => array('value' => 'Australia'),
      );
      $b = array(
          0 => array('value' => 'America'),
          1 => array('value' => 'England'),
          2 => array('value' => 'Canada'),
      );

    I need to get the 'new' subarrays that array $b brings to the table, i.e. I need to return:

      array('value' => 'Canada')

    I thought I could first merge $a + $b and then compare the result to $a:

      $merge = array_merge($a, $b);
      $result = array_diff($merge, $a);

    But somehow that does not work; it returns array(). How come? And thanks!

    Read the article

  • java:25: '.class' expected error while merging arrays

    - by user3677712
    Here is my code. The compiler is asking me to call a class, and I am confused as to how to do this. Noob to Java, so any help would be greatly appreciated. Line 25 is where the error occurs. This program is merging two arrays together into a new array.

      public class Merge {
          public static void main(String[] args) {
              int[] a = {1, 1, 4, 5, 7};
              int[] b = {2, 4, 6, 8};
              int[] mergedArray = merge(a, b);
              for (int i = 0; i < mergedArray.length; i++) {
                  System.out.print(mergedArray[i] + " ");
              }
          }

          public static int[] merge(int[] a, int[] b) {
              // WRITE CODE HERE
              int[] mergedArray = new int[a.length[] + b.length[]];
              int i = 0, j = 0, k = 0;
              while (i < a.length() && j < b.length()) // error occurs at this line
              {
                  if (a[i] < b[j]) {
                      mergedArray[k] = a[i];
                      i++;
                  } else {
                      mergedArray[k] = b[j];
                      j++;
                  }
                  k++;
              }
              while (i < a.length()) {
                  mergedArray[k] = a[i];
                  i++;
                  k++;
              }
              while (j < b.length()) {
                  mergedArray[k] = b[j];
                  j++;
                  k++;
              }
              return mergedArray;
          }
      }
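    The compiler messages here most likely come from how the array length is being read: Java arrays expose a plain length field, so both a.length[] and a.length() trip the compiler (the former is what typically produces the "'.class' expected" error). A sketch of the merge method with only those expressions corrected, everything else as in the question:

      public static int[] merge(int[] a, int[] b) {
          // Arrays have a length field -- no parentheses, no brackets.
          int[] mergedArray = new int[a.length + b.length];
          int i = 0, j = 0, k = 0;
          // Take the smaller front element from either array until one runs out.
          while (i < a.length && j < b.length) {
              if (a[i] < b[j]) {
                  mergedArray[k] = a[i];
                  i++;
              } else {
                  mergedArray[k] = b[j];
                  j++;
              }
              k++;
          }
          // Copy whatever remains of a, then of b.
          while (i < a.length) {
              mergedArray[k] = a[i];
              i++;
              k++;
          }
          while (j < b.length) {
              mergedArray[k] = b[j];
              j++;
              k++;
          }
          return mergedArray;
      }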

    Read the article

  • Multiple email accounts in a single personal folder in Outlook 2007

    - by Neoclearyst
    I have an account on Yahoo! Mail and another on Gmail. In Outlook 2007, I've set them up so that I can access them without having to go to their websites. I've password-protected my personal folder, but can't find a way to merge my accounts into one personal folder. When I want to switch between my accounts, I must type my password again. Besides that, I can't check for new mail messages in both accounts at the same time. How do I merge multiple email accounts into one single personal folder in Outlook 2007?

    Read the article

  • Merging Two KML Files to Display Them with Different Marker Icons on Google Maps

    - by Maxim Z.
    Let's say that I have two spreadsheets with addresses. I uploaded these spreadsheets into Google Fusion Tables, geocoded the addresses, and exported the results as KML files. Now, I want to take these two KML files and merge them, while maintaining the location data and using it to map the points with Google Maps. Well, I found a way to easily merge the KML files: import both of them into a "My Maps" map with Google Maps! However, my problem is this: when I do that, all of the locations in my data have the same marker icon on the map. From past experience, I know that these markers can somehow be defined inside the KML files. Is it possible to combine these two KML files while giving one's points one marker icon and the other's points another marker icon? Just in case my question is confusing, what I mean is giving the first set of points blue markers, for example, and the other set of points red markers, so that they can be overlaid.

    Read the article

  • Why use branches in SVN?

    - by ajsie
    I know that you can organize your files according to this structure in SVN: trunk, branches, tags. You copy the trunk to a folder in branches if you want to have a separate development line, and later on you merge this branch back to trunk. But I wonder why my group and I should do this. Why should one copy the trunk to a branch and work with this copy just to merge it back into the trunk, while meanwhile the code is frequently updated/committed to stay in sync with the trunk? Why not just work with the trunk then? What are the benefits of creating a branch? Would be great if someone could shed some light on this topic. Thanks in advance.

    Read the article

  • Combine Multiple Audio Files into a single higher-quality audio File

    - by namenlos
    BACKGROUND: My team gave a demo to a large audience. We recorded the audio of the demo in multiple locations in the room (3), using cheap laptop microphones; I was not involved in the recording of the audio or the demo. Both audio files suck in some form. The first one is a recording near the speaker: it clearly gets his voice, but the audience is muffled, and it is also slightly noisy. The second recording was done in the middle of the audience: it gets the audience questions clearly but gets the speaker sometimes well and sometimes poorly (not all the speakers spoke loudly enough to be heard).

    MY QUESTION: Is there any technique or software which can be used to merge these audio files in such a way that the best qualities of each are preserved? I am NOT asking to simply merge them together into one track -- I've already done that in Audacity and it is certainly better. What I am looking for could be considered closer to how HDR images are created: multiple exposures combined into an enhanced new version which is not simply an average of the inputs.

    NOTE: I am not an "audio" guy, just a normal user.

    Read the article

  • Word Macro: Move Cursor Down a Row

    - by Bryan
    I have a macro which I've been using to merge two cells together in a Word table, but what I want to do is to get the cursor to move down by one cell, so that I can repeatedly press the shortcut key to repeat the command over and over. The macro code that I have (shamelessly copied and pasted from a web page) is as follows:

      Sub MergeWithCellToRight()
      '
      ' MergeWithCellToRight Macro
      '
      '
          Dim oRng As Range
          Dim oCell As Cell
          Set oCell = Selection.Cells(1)
          If oCell.ColumnIndex = Selection.Rows(1).Cells.Count Then
              MsgBox "There is no cell to the right?", vbCritical, "Error"
              Exit Sub
          End If
          Set oRng = oCell.Range
          oRng.MoveEnd wdCell, 1
          oRng.Cells.Merge
          Selection.Collapse wdCollapseStart
      End Sub

    I've attempted to add the following line just before the 'End Sub' statement:

      Selection.MoveDown wdCell, 1

    but this generates the error "Run-time error '4120': Bad Parameter" whenever I execute the macro. Can anyone tell me how to correct this or what I'm doing wrong?

    Read the article

  • merging video and audio with custom panning

    - by cherouvim
    I have: a video which has mono audio inside, and an audio file (mono). I'd like to merge those two into a single video file containing the video from #1, with the audio from #1 panned full left and audio #2 panned full right. Is this possible in ffmpeg using one command? I've tried the following, which almost does this, but the video/audio gets out of sync:

      ffmpeg -i video.mp4 -filter_complex "amovie=audio.wav [r] ; [r] amerge" output.mp4 -y

    I've managed to do it with multiple commands:

      #1 create right panned audio
      ffmpeg -i audio.wav -ac 2 -vbr 5 audio-stereo.mp3 -y
      ffmpeg -i audio-stereo.mp3 -af pan=stereo:c1=c1 audio-right.mp3 -y

      #2 create left panned video
      ffmpeg -i video.mp4 -af pan=stereo:c0=c0 video-left.mp4 -y

      #3 merge the two
      ffmpeg -i video-left.mp4 -i audio-right.mp3 -filter_complex "amix=inputs=2" video-mixed.mp4 -y

    It does the job, but is it possible with one command?

    Read the article

  • Configure Git to use Beyond Compare for image diff

    - by Barney
    Because we work with a number of sprites, the kind of specialised diff views provided by Beyond Compare would be ideal to see which one of 2 versions I'm after when conflicts arise. I've already configured Git to use Beyond Compare as my primary diff and merge tool as described in their integration guide — it specifically goes into how to configure TortoiseSVN to use it for images, and I've found these articles talking about .gitattributes in general and how to script interactions from a *nix shell — but it's not obvious to me how I can use the advice provided by these guides to make a simple change that would say "use the default diff & merge bindings for files determined to be images, too". For the record, I'm doing all this on Windows :P

    Read the article

  • The new SSIS in SQL2005/SQL2008 are oversized

    - by Ice
    I studied the new MERGE statement and there is a nice example for importing a flat file: INSERT <Table> SELECT * FROM OPENROWSET BULK <Import-Flat-File>, <Format-File>... seems to be a good replacement for such a simple job and avoids building an SSIS package. EXEC XP_CMDSHELL bcp <Table or View> out <Flat-File> ... is almost simpler than building an SSIS package, isn't it? (I know that the MERGE statement doesn't run on SQL 2005.)

    Read the article

  • Remove a known network from Windows 8

    - by Edward Brey
    When Windows 8 detects a network based on the assigned IP address, netmask, default gateway, etc., it remembers the network along with the setting you give it as a public or private network. If you change the configuration of a network (e.g. reconfigure your router), Windows may determine you are on a new network and assign it a name of Network 2 or YourAPN 2. This less-than-friendly name shows up in many places in the Windows 8 UI, but unlike the good old days of Windows 7, there doesn't appear to be any UI to merge or delete these networks. What's the best way to merge or delete networks you don't want?

    Read the article

  • Migrate 3 terabytes of files to a new server windows 2003

    - by smackaysmith
    We have a new file server to handle the obscene amount of files generated by the company (PDFs, XLS, DOCs and JPGs). Files being moved to the new server total about 3tb. The problem is we can't take the company down for days to move the files. The other problem is the applications creating all these files have to reference previous files, so we can't simply point them to the new server. Also, there isn't an option to have the applications create files on the new server, but reference the old server for existing files. The servers are x64 win2003 r2. Both servers are on the same subnet. DFS doesn't work. Is there an application that can handle this amount of data to copy the files over, throttle bandwidth, and do a 'merge'? By merge I mean constantly copying over newly created files until the two servers are synched.

    Read the article

  • Git for beginners: The definitive practical guide

    - by Adam Davis
    Ok, after seeing this post by PJ Hyett, I have decided to skip to the end and go with git. So what I need is a beginners practical guide to git. "Beginner" being defined as someone who knows how to handle their compiler, understands to some level what a makefile is, and has touched source control without understanding it very well. "Practical" being defined as this person doesn't want to get into great detail regarding what git is doing in the background, and doesn't even care (or know) that it's distributed. Your answers might hint at the possibilities, but try to aim for the beginner that wants to keep a 'main' repository on a 'server' which is backed up and secure, and treat their local repository as merely a 'client' resource. Procedural note: PLEASE pick one and only one of the below topics and answer it clearly and concisely in any given answer. Don't try to jam a bunch of information into one answer. Don't just link to other resources - cut and paste with attribution if copyright allows, otherwise learn it and explain it in your own words (ie, don't make people leave this page to learn a task). Please comment on, or edit, an already existing answer unless your explanation is very different and you think the community is better served with a different explanation rather than altering the existing explanation. So: Installation/Setup How to install git How do you set up git? Try to cover linux, windows, mac, think 'client/server' mindset. Setup GIT Server with Msysgit on Windows How do you create a new project/repository? How do you configure it to ignore files (.obj, .user, etc) that are not really part of the codebase? Working with the code How do you get the latest code? How do you check out code? How do you commit changes? How do you see what's uncommitted, or the status of your current codebase? How do you destroy unwanted commits? How do you compare two revisions of a file, or your current file and a previous revision? How do you see the history of revisions to a file? How do you handle binary files (visio docs, for instance, or compiler environments)? How do you merge files changed at the "same time"? How do you undo (revert or reset) a commit? Tagging, branching, releases, baselines How do you 'mark' 'tag' or 'release' a particular set of revisions for a particular set of files so you can always pull that one later? How do you pull a particular 'release'? How do you branch? How do you merge branches? How do you resolve conflicts and complete the merge? How do you merge parts of one branch into another branch? What is rebasing? How do I track remote branches? How can I create a branch on a remote repository? Other Describe and link to a good gui, IDE plugin, etc that makes git a non-command line resource, but please list its limitations as well as its good. msysgit - Cross platform, included with git gitk - Cross platform history viewer, included with git gitnub - OS X gitx - OS X history viewer smartgit - Cross platform, commercial, beta tig - console GUI for Linux qgit - GUI for Windows, Linux Any other common tasks a beginner should know? Git Status tells you what you just did, what branch you have, and other useful information How do I work effectively with a subversion repository set as my source control source? 
Other git beginner's references git guide git book git magic gitcasts github guides git tutorial Progit - book by Scott Chacon Git - SVN Crash Course Delving into git Understanding git conceptually I will go through the entries from time to time and 'tidy' them up so they have a consistent look/feel and it's easy to scan the list - feel free to follow a simple "header - brief explanation - list of instructions - gotchas and extra info" template. I'll also link to the entries from the bullet list above so it's easy to find them later.

    Read the article

  • Microsoft SyncFramework - Sync different tables into one

    - by evnu
    Hello, we are trying to get the Microsoft SyncFramework running in our application to synchronize an oracle db with a mobile device. Problem The queries that we need to gather the data on the oracle db take much time (and we haven't found a way to speed them up yet), so we try to split them up in as much portions as possible. One big part of the whole problem is, that we need different information out of one big table, that bloats a query if combined. Unfortunately, the SyncFramework allows only one TableAdapter per SyncTable. Now this is a problem for our application: If we were able to use more than one TableAdapter per SyncTable, we could easily spread the queries in a more efficient way. Using one query per Table which combines all the needed data takes way too much time. Ideas I thought of creating different TableAdapters for each one of the required queries and then merge the resulting datasets afterwards (preferably on the server). This seems to work, but is a rather awkward solution. Does someone of you know a better solution? Or do you have some ideas that could help? Thanks in advance, evnu EDIT: So, I implemented the merge solution. If you are interested, take a look at the following code. I'll give more details if there are questions. <WebMethod()> _ Public Function GetChanges(ByVal groupMetadata As SyncGroupMetadata, ByVal syncSession As SyncSession) As SyncContext Dim stream As MemoryStream Dim format As BinaryFormatter = New BinaryFormatter Dim anchors As Dictionary(Of String, Byte()) ' keep track of the tables that will be updated Dim addTables As Dictionary(Of String, List(Of SyncTableMetadata)) = New Dictionary(Of String, List(Of SyncTableMetadata)) ' list of all present anchors Dim allAnchors As Dictionary(Of String, Byte()) = New Dictionary(Of String, Byte()) ' fill allAnchors - deserialize all given anchors For Each Table As SyncTableMetadata In groupMetadata.TablesMetadata If Table.LastReceivedAnchor Is Nothing Or Table.LastReceivedAnchor.IsNull Then Continue For stream = New MemoryStream(Table.LastReceivedAnchor.Anchor) anchors = format.Deserialize(stream) For Each item As KeyValuePair(Of String, Byte()) In anchors allAnchors.Add(item.Key, item.Value) Next stream.Dispose() Next For Each Table As SyncTableMetadata In groupMetadata.TablesMetadata If allAnchors.ContainsKey(Table.TableName) Then Table.LastReceivedAnchor.Anchor = allAnchors(Table.TableName) End If Dim addSyncTables As List(Of SyncTableMetadata) If syncSession.SyncParameters.Contains(Table.TableName) Then Dim tableNames() As String = syncSession.SyncParameters(Table.TableName).Value.ToString.Split(":") addSyncTables = New List(Of SyncTableMetadata) For Each tableName As String In tableNames Dim newSynctable As SyncTableMetadata = New SyncTableMetadata newSynctable.TableName = tableName If allAnchors.ContainsKey(tableName) Then Dim anker As SyncAnchor = New SyncAnchor(allAnchors(tableName)) newSynctable.LastReceivedAnchor = anker Else newSynctable.LastReceivedAnchor = Nothing End If newSynctable.SyncDirection = Table.SyncDirection addSyncTables.Add(newSynctable) Next addTables.Add(Table.TableName, addSyncTables) End If Next ' add the newly created synctables For Each item As KeyValuePair(Of String, List(Of SyncTableMetadata)) In addTables For Each Table As SyncTableMetadata In item.Value groupMetadata.TablesMetadata.Add(Table) Next Next ' fire queries Dim context As SyncContext = servSyncProvider.GetChanges(groupMetadata, syncSession) ' merge resulting datasets For Each item As KeyValuePair(Of String, 
List(Of SyncTableMetadata)) In addTables For Each Table As SyncTableMetadata In item.Value If context.DataSet.Tables.Contains(Table.TableName) Then If Not context.DataSet.Tables.Contains(item.Key) Then Dim tmp As DataTable = context.DataSet.Tables(Table.TableName).Copy tmp.TableName = item.Key context.DataSet.Tables.Add(tmp) Else context.DataSet.Tables(item.Key).Merge(context.DataSet.Tables(Table.TableName)) context.DataSet.Tables.Remove(Table.TableName) End If End If Next Next ' create new anchors Dim allAnchorsDict As Dictionary(Of String, Byte()) = New Dictionary(Of String, Byte()) For Each Table As SyncTableMetadata In groupMetadata.TablesMetadata allAnchorsDict.Add(Table.TableName, context.NewAnchor.Anchor) Next stream = New MemoryStream format.Serialize(stream, allAnchorsDict) context.NewAnchor.Anchor = stream.ToArray stream.Dispose() Return context End Function

    Read the article

  • ILMerge - Unresolved assembly reference not allowed: System.Core

    - by Steve Michelotti
    ILMerge is a utility which allows you to merge multiple .NET assemblies into a single binary assembly for more convenient distribution. Recently we ran into problems when attempting to use ILMerge on a .NET 4 project. We received the error message:

      An exception occurred during merging:
      Unresolved assembly reference not allowed: System.Core.
          at System.Compiler.Ir2md.GetAssemblyRefIndex(AssemblyNode assembly)
          at System.Compiler.Ir2md.GetTypeRefIndex(TypeNode type)
          at System.Compiler.Ir2md.VisitReferencedType(TypeNode type)
          at System.Compiler.Ir2md.GetMemberRefIndex(Member m)
          at System.Compiler.Ir2md.PopulateCustomAttributeTable()
          at System.Compiler.Ir2md.SetupMetadataWriter(String debugSymbolsLocation)
          at System.Compiler.Ir2md.WritePE(Module module, String debugSymbolsLocation, BinaryWriter writer)
          at System.Compiler.Writer.WritePE(String location, Boolean writeDebugSymbols, Module module, Boolean delaySign, String keyFileName, String keyName)
          at System.Compiler.Writer.WritePE(CompilerParameters compilerParameters, Module module)
          at ILMerging.ILMerge.Merge()
          at ILMerging.ILMerge.Main(String[] args)

    It turns out that this issue is caused by ILMerge.exe not being able to find the .NET 4 framework by default. The answer was ultimately found here. You either have to use the /lib option to point to your .NET 4 framework directory (e.g., “C:\Windows\Microsoft.NET\Framework\v4.0.30319” or “C:\Windows\Microsoft.NET\Framework64\v4.0.30319”) or just use an ILMerge.exe.config file that looks like this:

      <configuration>
        <startup useLegacyV2RuntimeActivationPolicy="true">
          <requiredRuntime safemode="true" imageVersion="v4.0.30319" version="v4.0.30319"/>
        </startup>
      </configuration>

    This was able to successfully resolve my issue.

    Read the article

  • SQL SERVER – Subquery or Join – Various Options – SQL Server Engine knows the Best

    - by pinaldave
    This is a follow-up post to my earlier article SQL SERVER – Convert IN to EXISTS – Performance Talk; after reading all the comments I have received, I felt that I could write more on the same subject to clear a few things up. First let us run the following four queries; all of them give exactly the same resultset.

      USE AdventureWorks
      GO
      -- use of =
      SELECT * FROM HumanResources.Employee E
      WHERE E.EmployeeID = (
          SELECT EA.EmployeeID FROM HumanResources.EmployeeAddress EA
          WHERE EA.EmployeeID = E.EmployeeID)
      GO
      -- use of IN
      SELECT * FROM HumanResources.Employee E
      WHERE E.EmployeeID IN (
          SELECT EA.EmployeeID FROM HumanResources.EmployeeAddress EA
          WHERE EA.EmployeeID = E.EmployeeID)
      GO
      -- use of EXISTS
      SELECT * FROM HumanResources.Employee E
      WHERE EXISTS (
          SELECT EA.EmployeeID FROM HumanResources.EmployeeAddress EA
          WHERE EA.EmployeeID = E.EmployeeID)
      GO
      -- use of JOIN
      SELECT * FROM HumanResources.Employee E
      INNER JOIN HumanResources.EmployeeAddress EA ON E.EmployeeID = EA.EmployeeID
      GO

    Let us compare the execution plans of the queries listed above. It is quite clear from the execution plans that in the case of IN, EXISTS and JOIN, the SQL Server engine is smart enough to figure out that the optimal plan for the query is a Merge Join, and it executes that plan. However, in the case of the Equal (=) operator, SQL Server is forced to use a Nested Loop and test each result of the inner query against the outer query, which cuts performance. Please note that I am by no means suggesting that Nested Loop is bad or that Merge Join is better; this can very well vary with your machine and the amount of resources available on your computer. When I see the Equal (=) operator used in a query like the above, I usually recommend checking whether the user can use IN, EXISTS or JOIN instead. As I said, this can very much vary on different systems. What is your take on the above query? I believe the SQL Server engine is usually pretty smart about figuring out the ideal execution plan and using it. Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Read the article

  • Merging Waterfall and Agile – Getting the Worst of Both Worlds

    - by Nick Harrison
    Many people have seen and appreciate the elegance and practicality of agile methodologies. Sadly, there is still not widespread adoption. There is still push back from many directions and from many different sources. Some people don't understand how it is supposed to work. Some people don't believe that it could possibly work. Some people mistakenly believe that it is just code for a lazy project team trying to wiggle out of structure. Some people mistakenly believe that it can work only with a very small, highly trained team. Some people are afraid of the control that they feel they will be losing. I have seen some people try to merge agile and waterfall hoping to achieve the best of both worlds. Unfortunately, the reality is that you end up with the worst of both worlds. And they both can get pretty bad.

    Another Sad Reality: Some people, in an effort to get buy-in for following an agile methodology, have attempted to merge these two practices. Sometimes this may stem from trying to assuage individual fears that they are not losing relevance. Sometimes it may be to meet contractual obligations or to fulfill regulatory requirements. Sometimes they may not know better. These two approaches to software development cannot coexist on the same project.

    Let's review the main tenets of the Agile Manifesto: individuals and interactions over processes and tools; working software over comprehensive documentation; customer collaboration over contract negotiation; responding to change over following a plan. That is, while there is value in the items on the right, we value the items on the left more.

    Meanwhile, the main tenets of the waterfall approach could be summarized as: processes and procedures over individuals; comprehensive documentation proves that the software works; well-defined contracts and negotiations protect the customer relationship; if the plan is made right, there should be no change.

    Merging these two approaches will always end badly.

    Read the article

  • Is there any way to stop a window's title bar merging with the panel when maximised?

    - by Richard Turner
    I'm working on a desktop machine with plenty of screen real-estate, so I don't need my windows' title bars to merge with the global menu bar when the windows are maximised. Moreover, I'm working on a dual-screen set-up, so the fact that a window is maximised doesn't mean that it's the only window visible. Before Unity I'd switch to a maximised window by clicking on its title bar, or close the window, even though it isn't focused, by clicking on its close button; I can no longer do this because the title bar is missing and the global menu bar is empty on that screen. This isn't a huge problem - I can click on some of the window's chrome to focus it - but it's unintuitive and it's forcing me to relearn my mousing behaviour. I'd like to turn off the merging of title and global menu bars, but how? EDIT: I simply want the title bar of the window NOT to merge with the top panel whenever I maximize a window. The global menu should stay in the top panel as far as I am concerned. Currently it maximizes like this [screenshot]; I want it to maximize like this [screenshot] (in that screenshot the unmaximized window has been resized to take up the rest of the space).

    Read the article

  • Public EC Meeting scheduled for 20 November

    - by Heather VanCura
    The minutes and materials from the October 2012 JCP EC Teleconference are now available. The next JCP EC Meeting, and the first EC Meeting under JCP 2.9 with the merged EC, is scheduled for 20 November. The second hour of this meeting will be open to the public at 3:00 PM PST. The agenda includes JSR 355, the EC merge implementation report, the JSR 358 (JCP.next.3) status report, the JCP 2.8 status update and the community audit program. Details are below. We hope you will join us, but if you cannot attend, not to worry -- the recording and materials will also be public on the JCP.org multimedia page.

    Meeting details:
    Date & Time: Tuesday November 20, 2012, 3:00 - 4:00 pm PST
    Location: Teleconference
    Dial-in: +1 (866) 682-4770 (US), conference code 627-9803, security code 52732 ("JCPEC" on your phone handset). For global access numbers see http://www.intercall.com/oracle/access_numbers.htm, or +1 (408) 774-4073.
    WebEx: Browse for the meeting from https://jcp.webex.com. No registration required (enter your name and email address). Password: JCPEC

    Agenda:
    JSR 355 (the EC merge) implementation report
    JSR 358 (JCP.next.3) status report
    2.8 status update and community audit program
    Discussion/Q&A

    Note: The call will be recorded and the recording published on jcp.org, so those who are unable to join in real-time will still be able to participate.

    Read the article

  • What's a good approach to adding debug code to your application when you want more info about what's going wrong?

    - by Andrei
    When our application doesn't work the way we expect it to (e.g. throws exceptions etc.), I usually insert a lot of debug code at certain points in the application in order to get a better overview of what exactly is going on, what the values for certain objects are, to better trace where this error is triggered from. Then I send a new installer to the user(s) that are having the problem and if the problem is triggered again I look at the logs and see what they say. But I don't want all this debug code to be in the production code, since this would create some really big debug files with information that is not always relevant. The other problem is that our code base changes, and the next time, the same debug code might have to go in different parts of the application. Questions Is there a way to merge this debug code within the production code only when needed and have it appear at the correct points within the application? Can it be done with a version control system like git so that all would be needed is a git merge? P.S. The application I'm talking about now is .NET, written in C#.

    Read the article
