Search Results

Search found 8335 results on 334 pages for 'msbuild extension pack'.


  • Adding AllowPartiallyTrustedCallers with MSBuild

    - by Ben Rice
    I am using CC.Net with MSBuild tasks to build an application that is composed of a number of solutions and projects. We are using the AssemblyInfo MSBuild Community task to update version info in AssemblyInfo.cs. Unfortunately, the AllowPartiallyTrustedCallers attribute doesn't get written, and the AssemblyInfo task reports that the AllowPartiallyTrustedCallers attribute is not supported. Is there any way to add that attribute through MSBuild without resorting to a custom task that just tacks the line onto the end of the file after the fact?
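
    For reference, a minimal sketch of the kind of AssemblyInfo task invocation described above (the file path and version values are invented, not taken from the actual build script):

      <Import Project="$(MSBuildExtensionsPath)\MSBuildCommunityTasks\MSBuild.Community.Tasks.Targets" />

      <Target Name="UpdateAssemblyInfo">
        <!-- Regenerates AssemblyInfo.cs from a fixed set of known attributes,
             which is why the task cannot emit AllowPartiallyTrustedCallers. -->
        <AssemblyInfo CodeLanguage="CS"
                      OutputFile="Properties\AssemblyInfo.cs"
                      AssemblyVersion="1.0.0.0"
                      AssemblyFileVersion="1.0.0.0" />
      </Target>

    A common workaround is to keep [assembly: AllowPartiallyTrustedCallers] in a second, hand-maintained source file that the task never regenerates, so it survives each versioning pass.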


  • MSBuild fails, but building inside Visual Studio works fine

    - by Matt
    C#, .NET 2.0. I have an ASP.NET website in a solution, with 2 other projects (used as library references). When I build (debug or release) in Visual Studio, everything works fine. However, building with MSBuild fails. This build had been working (it's actually invoked via a NAnt task). The only thing that has changed is that I have a new user control whose type I am referencing in my code-behind. The offending code is in my ASPX code-behind; MessageAlert is the user control:

      MessageAlert userControl = this.LoadControl("~/UserControls/MessageAlert.ascx") as MessageAlert;
      userControl.UserMessage = message;
      this.UserMessages.Controls.Add(userControl);

    In order to get Visual Studio to recognize the type 'MessageAlert' I had to:

    1) Set ClassName="MessageAlert" in the @Control markup at the top of the user control (because using the auto-generated UserControls_MessageAlert wasn't working either)
    2) Register the user control in the markup of my ASPX, using an @Register directive
    3) Add a "using ASP" to the top of my code-behind

    After those steps, I could successfully reference the MessageAlert type in my code-behind from Visual Studio. But from MSBuild I get "The type or namespace name 'MessageAlert' could not be found (are you missing a using directive or an assembly reference?)". The MSBuild execution is very simple: it points at the very same solution file and sets the configuration property to Release. It seems, based on the number of steps I had to go through to get type references to MessageAlert working in Visual Studio, that there is something missing in the MSBuild process. But what? Doesn't Visual Studio in fact invoke MSBuild behind the scenes? Is there a better way to reference a UserControl type in the code-behind of an ASPX? EDIT: To clarify, the MessageAlert user control is not in the other referenced assemblies/projects. I mentioned them because, together with the website, they compose the solution file, which is the same .sln file being built by MSBuild.


  • MSBuild 4 fails to build VS2008 csproj due to 1 compiler warning

    - by David White
    We have a VS2008 C# DLL project targeting .NET 3.5. It builds successfully on our CI server when using MSBuild 3.5. When the CI server is upgraded to use MSBuild 4.0, the same project fails to build, due to one warning message:

      c:\WINDOWS\Microsoft.NET\Framework\v3.5\Microsoft.Common.targets(1418,9): warning MSB3283: Cannot find wrapper assembly for type library "ADODB".

    The warning does not occur with MSBuild 3.5, and I'm surprised that it results in Build FAILED. We do not have the project set to treat warnings as errors. All our other projects build successfully with either version of MSBuild.


  • How to prevent MSBUILD from copying dependent GAC assemblies to bin

    - by Matt Wrock
    I have an MSBuild task that builds my solution, and I am migrating it from .NET 3.5 to 4.0. I have some dependent DLLs that have Copy Local set to true. The 4.0 version of MSBuild is not only copying the dependent DLL (which I want), it is also copying all dependent assemblies of that DLL from the 32-bit version of the GAC to my bin. Not only do I not want these files copied from the GAC, I especially do not want the 32-bit versions in this 64-bit build. Has the behavior changed in MSBuild 4.0? And does anyone know how to force MSBuild to use the 3.5 behavior?
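
    One thing worth checking, sketched below under the assumption that the extra copies come through the reference's dependency resolution: reference metadata in the .csproj controls copy-local behavior, and marking a reference as non-private stops ResolveAssemblyReferences from copying it (and its resolved dependencies) to bin. Whether this restores the exact 3.5 behavior is untested, and SomeGacDependency is a placeholder name:

      <Reference Include="SomeGacDependency">
        <!-- Private is the project-file spelling of "Copy Local". -->
        <Private>False</Private>
      </Reference>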


  • Visual Studio msbuild

    - by user62958
    I have a question regarding the command-line options of MSBuild. I am currently using MSBuild to build projects using the existing solution files. These solution files have references to external DLLs which have different paths on each machine. I am currently writing a build script and passing the specific path to the project file via the /p: switch of MSBuild. My current build line is:

      msbuild test.sln /p:ReferencePath="c:\abc" /p:ReferencePath="c:\rca"

    What I have noticed is that ReferencePath now contains only c:\rca and not c:\abc. This is causing problems for me, since the external DLLs live in two different directories. I am allowed to keep multiple reference paths via Visual Studio, but not via the command line. Is there any known way by which I can do this?
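
    Each /p: switch overwrites any previous value of the same property, which is why only c:\rca survives. A sketch of the usual workaround (untested against this particular solution) is to pass both directories as one semicolon-separated value:

      msbuild test.sln /p:ReferencePath="c:\abc;c:\rca"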


  • Use msbuild community tasks without installing

    - by mickyjtwin
    In our developer environment, no users have administration rights. As such, it's not possible to install MSBuild.CommunityTasks without getting an admin to do so. What I'm wondering is similar to NAnt: is it possible to include the files in your solution directory and just reference them from there? This way it will not matter whether the person has it installed; when they check out the latest code, the MSBuild Community files will come with it, and the solution will build.

      \SolutionDir\{solution}.sln
      \SolutionDir\Project\{files}
      \SolutionDir\MSBuild.Community\files
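
    This should be workable: the community tasks' .targets file locates its task assembly through the MSBuildCommunityTasksPath property, so a sketch along these lines (assuming MSBuild.Community.Tasks.dll and the .targets file are checked in under \SolutionDir\MSBuild.Community) lets the build run straight from the checkout:

      <PropertyGroup>
        <!-- Point the tasks at the checked-in copy instead of the installed one. -->
        <MSBuildCommunityTasksPath>$(SolutionDir)MSBuild.Community</MSBuildCommunityTasksPath>
      </PropertyGroup>
      <Import Project="$(SolutionDir)MSBuild.Community\MSBuild.Community.Tasks.Targets" />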


  • Error while running Scheduled Task from MSBUILD script

    - by SVI
    I am running an MSBuild script which disables and enables a scheduled task from the command line, and I am getting the following error. I have no idea what 'exited with code 1' means. When I copy and paste the line into a command prompt, it works perfectly; it just does not run in the MSBuild script, which is in turn called by CruiseControl.NET. The snippet from the MSBuild file is: … The error is:

      The command "C:\WINDOWS\system32\schtasks.exe /S servername /Change /RU SYSTEM /TN "MyScheduledtask" /DISABLE" exited with code 1.
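
    The original snippet did not survive, but judging from the error text it was presumably an Exec invocation along these lines (a hypothetical reconstruction, not the poster's actual XML). Exec treats any non-zero exit code as a failure, and schtasks commonly returns 1 when the account the build runs under lacks rights on the target server:

      <Target Name="DisableScheduledTask">
        <!-- Hypothetical reconstruction of the lost snippet. -->
        <Exec Command='C:\WINDOWS\system32\schtasks.exe /S servername /Change /RU SYSTEM /TN "MyScheduledtask" /DISABLE' />
      </Target>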


  • SQL SERVER – SQL Server 2008 with Service Pack 2

    - by pinaldave
    SQL Server 2008 Service Pack 2 has already been released. I suggest that all of you who are running SQL Server 2008 update to SQL Server 2008 Service Pack 2. Download SQL Server 2008 Service Pack 2 from here. Please note, this is not SQL Server 2008 R2; it is SQL Server 2008 Service Pack 2. The Test Lab Guide for SQL Server 2008 with Service Pack 2 has also been released by Microsoft. This document contains an introduction to SQL Server 2008 with Service Pack 2 and step-by-step instructions for extending the Base Configuration test lab to include a SQL Server 2008 with Service Pack 2 server. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology


  • Collection RemoveAll Extension Method

    - by João Angelo
    I had previously posted a RemoveAll extension method for the Dictionary<K,V> class, now it's time to have one for the Collection<T> class. The signature is the same as in the corresponding method already available in List<T> and the implementation relies on the RemoveAt method to perform the actual removal of each element. Finally, here's the code:

      public static class CollectionExtensions
      {
          /// <summary>
          /// Removes from the target collection all elements that match the specified predicate.
          /// </summary>
          /// <typeparam name="T">The type of elements in the target collection.</typeparam>
          /// <param name="collection">The target collection.</param>
          /// <param name="match">The predicate used to match elements.</param>
          /// <exception cref="ArgumentNullException">
          /// The target collection is a null reference.
          /// <br />-or-<br />
          /// The match predicate is a null reference.
          /// </exception>
          /// <returns>Returns the number of elements removed.</returns>
          public static int RemoveAll<T>(this Collection<T> collection, Predicate<T> match)
          {
              if (collection == null)
                  throw new ArgumentNullException("collection");
              if (match == null)
                  throw new ArgumentNullException("match");

              int count = 0;

              for (int i = collection.Count - 1; i >= 0; i--)
              {
                  if (match(collection[i]))
                  {
                      collection.RemoveAt(i);
                      count++;
                  }
              }

              return count;
          }
      }
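
    A minimal usage sketch (the values are invented for illustration), assuming the extension class above is in scope. Iterating backwards, as the method does, keeps the indexes of the elements not yet visited stable while items are removed:

      using System;
      using System.Collections.ObjectModel;

      class Program
      {
          static void Main()
          {
              var numbers = new Collection<int> { 1, 2, 3, 4, 5, 6 };

              // Removes 2, 4 and 6; removed == 3, numbers is left as { 1, 3, 5 }.
              int removed = numbers.RemoveAll(n => n % 2 == 0);

              Console.WriteLine("Removed {0} elements.", removed);
          }
      }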


  • RemoveAll Dictionary Extension Method

    - by João Angelo
    Removing from a dictionary all the elements where the keys satisfy a set of conditions is something I needed to do more than once, so I implemented it as an extension method to the IDictionary<TKey, TValue> interface. Here's the code:

      public static class DictionaryExtensions
      {
          /// <summary>
          /// Removes all the elements where the key matches the conditions defined by the specified predicate.
          /// </summary>
          /// <typeparam name="TKey">The type of the dictionary key.</typeparam>
          /// <typeparam name="TValue">The type of the dictionary value.</typeparam>
          /// <param name="dictionary">A dictionary from which to remove the matched keys.</param>
          /// <param name="match">The <see cref="Predicate{T}"/> delegate that defines the conditions of the keys to remove.</param>
          /// <exception cref="ArgumentNullException">
          /// dictionary is null
          /// <br />-or-<br />
          /// match is null.
          /// </exception>
          /// <returns>The number of elements removed from the <see cref="IDictionary{TKey, TValue}"/>.</returns>
          public static int RemoveAll<TKey, TValue>(
              this IDictionary<TKey, TValue> dictionary,
              Predicate<TKey> match)
          {
              if (dictionary == null)
                  throw new ArgumentNullException("dictionary");
              if (match == null)
                  throw new ArgumentNullException("match");

              var keysToRemove = dictionary.Keys.Where(k => match(k)).ToList();

              if (keysToRemove.Count == 0)
                  return 0;

              foreach (var key in keysToRemove)
              {
                  dictionary.Remove(key);
              }

              return keysToRemove.Count;
          }
      }
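
    And a matching usage sketch (keys and values invented), assuming the extension class above is in scope. The keys are materialized with ToList() first because removing entries while enumerating Keys directly would break the enumeration:

      using System;
      using System.Collections.Generic;

      class Program
      {
          static void Main()
          {
              var ages = new Dictionary<string, int>
              {
                  { "ann", 34 }, { "bob", 45 }, { "carol", 29 }
              };

              // Removes the "bob" entry; removed == 1.
              int removed = ages.RemoveAll(key => key.StartsWith("b"));

              Console.WriteLine("Removed {0} entries.", removed);
          }
      }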


  • Windows 7 Home Premium Family Pack upgrade + Windows Anytime Upgrade to Ultimate

    - by krebstar
    Is a Windows 7 Home Premium Family Pack license upgradeable? Suppose I'm able to install Windows 7 Home Premium on my computer using a Family Pack key. Would I then be able to upgrade to Windows 7 Ultimate using the Anytime Upgrade option? Considering that the Family Pack uses only one key to activate three PCs, what happens when I upgrade one of those computers with Anytime Upgrade? Would I be able to use the Anytime Upgrade key on all three computers, or just one? Is upgrading this license key even possible?


  • System File Checker vs Service Pack Reinstall

    - by Nixphoe
    When trying to repair slow workstations, I've found that running sfc /scannow helps quite a lot in a few of my environments running really old computers. I've also seen recommendations to reinstall the last service pack after software installation to help keep the system stable. That makes sense, as it would replace a lot of the DLL files with the ones that come with the service pack. They both seem to do the same thing, but SFC sometimes will ask for a disk, where the service pack will not. What is the main difference between the two?


  • Windows 7: Microsoft Desktop Optimization Pack updated, the virtualization and deployment pack

    Windows 7: Microsoft Desktop Optimization Pack updated. The virtualization and deployment pack. Microsoft has just updated its deployment and virtualization solutions pack, the Microsoft Desktop Optimization Pack (MDOP). The MDOP update mainly concerns MED-V (Microsoft Enterprise Desktop Virtualization), which is now available in version 2.0, and APP-V 4, whose Service Pack 1 is now available. SP1 for Microsoft APP-V 4 makes the application virtualization process easier and faster thanks to the integration of the "package ...


  • Thoughts on C# Extension Methods

    - by Damon
    I'm not a huge fan of extension methods. When they first came out, I remember seeing a method on an object that was fairly useful, but when I went to use it in another piece of code that method wasn't available. Turns out it was an extension method and I hadn't included the appropriate assembly and imports statement in my code to use it. I remember being a bit confused at first about how the heck that could happen (hey, extension methods were new, cut me some slack) and it took a bit of time to track down exactly what it was that I needed to include to get that method back. I just imagined a new developer trying to figure out why a method was missing and fruitlessly searching on MSDN for a method that didn't exist, and it just didn't sit well with me.

    I am of the opinion that if you have an object, then you shouldn't have to include additional assemblies to get additional instance-level methods out of that object. That opinion applies to namespaces as well; I do not like it when the contents of a namespace are split out into multiple assemblies. I prefer to have static utility classes instead of extension methods to keep things nicely packaged into a cohesive unit. It also makes it abundantly clear where utility methods are used in code. I will concede, however, that it can make code a bit more verbose and lengthy. There is always a trade-off.

    Some people harp on extension methods because they break the tenets of object-oriented development and allow you to add methods to sealed classes. Whatever. Extension methods are just utility methods that you can tack onto an object after the fact. Extension methods do not give you any more access to an object than the developer of that object allows, so I say that those who cry OO foul on extension methods really don't have much of an argument on which to stand. In fact, I have to concede that my dislike of them is really more about style than anything of great substance.

    One interesting thing that I found regarding extension methods is that you can call them on null objects. Take a look at this extension method:

      namespace ExtensionMethods
      {
          public static class StringUtility
          {
              public static int WordCount(this string str)
              {
                  if (str == null) return 0;
                  return str.Split(new char[] { ' ', '.', '?' },
                      StringSplitOptions.RemoveEmptyEntries).Length;
              }
          }
      }

    Notice that the extension method checks to see if the incoming string parameter is null. I was worried that the runtime would perform a check on the object instance to make sure it was not null before calling an extension method, but that is apparently not the case. So, if you call the following code it runs just fine:

      string s = null;
      int words = s.WordCount();

    I am a big fan of things working, but this seems to go against everything I've come to know about instance-level methods. However, an extension method is really a static method masquerading as an instance-level method, so I suppose it would be far more frustrating if it failed, since there is really no reason it shouldn't succeed.

    Although I'm not a fan of extension methods, I will say that if you ever find yourself at an impasse with a die-hard fan of either the utility class or extension method approach, then there is a common ground. Extension methods are defined in static classes, and you call them from those static classes as well as directly from the objects they extend.
So if you build your utility classes using extension methods, then you can have it your way and they can have it theirs. 


  • MSBuild: Add additional files to compile without altering the project file

    - by Craig Norton
    After looking around, I can't find a simple answer to this problem. I am trying to create an MSBuild file to allow me to easily use SpecFlow and NUnit within Visual Studio 2010 Express. The file below is not complete; this is just a proof of concept, and it needs to be made more generic.

      <Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
        <PropertyGroup>
          <BuildDependsOn>
            BuildSolution;
            SpecFlow;
            BuildProject;
            NUnit;
          </BuildDependsOn>
        </PropertyGroup>
        <PropertyGroup>
          <Solution>C:\Users\Craig\Documents\My Dropbox\Cells\Cells.sln</Solution>
          <CSProject>C:\Users\Craig\Documents\My Dropbox\Cells\Configuration\Configuration.csproj</CSProject>
          <DLL>C:\Users\Craig\Documents\My Dropbox\Cells\Configuration\bin\Debug\Configuration.dll</DLL>
        </PropertyGroup>
        <Target Name="Build" DependsOnTargets="$(BuildDependsOn)">
        </Target>
        <Target Name="BuildSolution">
          <MSBuild Projects="$(Solution)" Properties="Configuration=Debug" />
        </Target>
        <Target Name="SpecFlow">
          <Exec Command="SpecFlow generateall $(CSProject)" />
        </Target>
        <Target Name="BuildProject">
          <MSBuild Projects="$(CSProject)" Properties="Configuration=Debug" />
        </Target>
        <Target Name="NUnit">
          <Exec Command='NUnit /run "$(DLL)"' />
        </Target>
      </Project>

    The SpecFlow task looks in the .csproj file and creates a SpecFlowFeature1.feature.cs. I need to include this file when building the .csproj so that NUnit can use it. I know I could modify (either directly or on a copy) the .csproj file to include the generated file, but I'd prefer to avoid this. My question is: is there a way to use the MSBuild task to build the project file and tell it to include an additional file in the build? Thank you.
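
    One avenue worth sketching, untested here: Microsoft.Common.targets imports an optional extension file named by the CustomAfterMicrosoftCommonTargets property, so an extra Compile item can be injected from a side file without editing the .csproj. The file name ExtraFiles.targets is invented; SpecFlowFeature1.feature.cs comes from the question:

      <!-- ExtraFiles.targets: injects the generated file into the compile. -->
      <Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
        <ItemGroup>
          <Compile Include="SpecFlowFeature1.feature.cs" />
        </ItemGroup>
      </Project>

    It would then be wired up from the wrapper script's BuildProject target:

      <Target Name="BuildProject">
        <MSBuild Projects="$(CSProject)"
                 Properties="Configuration=Debug;CustomAfterMicrosoftCommonTargets=$(MSBuildProjectDirectory)\ExtraFiles.targets" />
      </Target>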


  • MSBuild Script and VS2010 publish apply Web.config Transform

    - by Jason
    So, I have VS 2010 installed and am in the process of modifying my MSBuild script for our TeamCity build integration. Everything is working great, with one exception. How can I tell MSBuild that I want to apply the Web.config transform files that I've created when I publish the build? I have the following, which produces the compiled web site, but it outputs the Web.config, Web.Debug.config and Web.Release.config files (all 3) to the compiled output directory. In Studio, when I perform a publish to the file system, it will do the transform and only output the Web.config with the appropriate changes:

      <Target Name="CompileWeb">
        <MSBuild Projects="myproj.csproj" Properties="Configuration=Release;" />
      </Target>

      <Target Name="PublishWeb" DependsOnTargets="CompileWeb">
        <MSBuild Projects="myproj.csproj"
                 Targets="ResolveReferences;_CopyWebApplication"
                 Properties="WebProjectOutputDir=$(OutputFolder)$(WebOutputFolder); OutDir=$(TempOutputFolder)$(WebOutputFolder)\;Configuration=Release;" />
      </Target>

    Any help would be great! I know this can be done by other means, but I would like to do this using the new VS 2010 way if possible.
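
    For what it's worth, the transform engine VS uses is exposed as the TransformXml task, so a sketch along these lines (the v10.0 assembly path is an assumption about the machine's VS install) could run the Release transform explicitly from the script and then discard the untransformed files:

      <UsingTask TaskName="TransformXml"
                 AssemblyFile="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.Tasks.dll" />

      <Target Name="TransformConfig" DependsOnTargets="PublishWeb">
        <!-- Apply the Release transform over the copied Web.config. -->
        <TransformXml Source="Web.config"
                      Transform="Web.Release.config"
                      Destination="$(OutputFolder)$(WebOutputFolder)\Web.config" />
        <!-- Remove the raw transform files from the output. -->
        <Delete Files="$(OutputFolder)$(WebOutputFolder)\Web.Debug.config;$(OutputFolder)$(WebOutputFolder)\Web.Release.config" />
      </Target>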


  • .NET projects build automation with NAnt/MSBuild + SVN

    - by petr k.
    Hi everyone, for quite a while now I've been trying to figure out how to set up an automated build process at our shop. I've read many posts and guides on this matter and none of them really fits my specific needs. My SVN repository is laid out as follows:

      \projects
        \projectA (a product)
          \tags
            \1.0.0.1
            \1.0.0.2
            ...
          \trunk
            \src
              \proj1 (a VS C# project)
              \proj2
            \documentation

    Then I have a network share, with a folder for each project (product), which in turn contains the binaries, written documentation and the generated API documentation (via NDoc; each project may have an .ndoc file in the repository) for every historical version (from the tags SVN folder) and for the latest version as well (from the trunk). Basically, what I want to do in a scheduled batch build are these steps:

    1. Examine the project's SVN folder and identify tags not present on the network share.
    2. For each of these tags: check out the tag folder, build (with the Release configuration), copy the resulting binaries to the network share, search for .ndoc files, generate CHM files via NDoc, and copy the resulting CHM files to the network share.
    3. Do the same as in 2., but for the HEAD revision of trunk.

    Now, the trouble is, I have no idea where to start. I do not keep .sln files in the repository, but I am able to replace these with MSBuild files which in turn build the C# projects belonging to the specific product. I guess the most troubling part is the examination of the repository for tags which have not been processed yet, i.e. searching the tags and comparing them to a project's directory structure on the network share. I have no idea how to do that in any of the build tools (NAnt, MSBuild). Could you please provide me with some pointers on how to approach this task as a whole and in detail as well? I do not care if I use NAnt, MSBuild, or both. I am aware that this might be rather complex, but every idea and NAnt/MSBuild snippet will be a great help. Thanks in advance.
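
    For the tag-discovery step, a rough sketch of one building block (the repository URL is invented, and Exec's ConsoleToMSBuild parameter requires MSBuild 4.0): capture the output of svn list into a property, then compare it against the share's folders from there:

      <Target Name="ListTags">
        <!-- svn list prints one tag folder per line, e.g. "1.0.0.1/". -->
        <Exec Command="svn list svn://server/projects/projectA/tags" ConsoleToMSBuild="true">
          <Output TaskParameter="ConsoleOutput" PropertyName="TagList" />
        </Exec>
        <Message Text="Tags found: $(TagList)" />
      </Target>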


  • MSBuild Validating Properties

    - by Brian Gillespie
    I'm working on a reusable MSBuild target that will be consumed by several other tasks. This target requires that several properties be defined. What's the best way to validate that properties are defined, throwing an error if they are not? Two attempts that I almost like:

      <?xml version="1.0" encoding="utf-8" ?>
      <Project ToolsVersion="3.5" DefaultTarget="Release" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
        <Target Name="Release">
          <Error Text="Property PropA required" Condition="'$(PropA)' == ''"/>
          <Error Text="Property PropB required" Condition="'$(PropB)' == ''"/>
          <!-- The body of the task -->
        </Target>
      </Project>

    Here's an attempt at batching. It's ugly because of the extra "Name" parameter. Is it possible to use the Include attribute instead?

      <?xml version="1.0" encoding="utf-8" ?>
      <Project ToolsVersion="3.5" DefaultTarget="Release" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
        <Target Name="Release">
          <!-- MSBuild BuildInParallel="true" Projects="@(ProjectsToBuild)"/ -->
          <ItemGroup>
            <RequiredProperty Include="PropA"><Name>PropA</Name></RequiredProperty>
            <RequiredProperty Include="PropB"><Name>PropB</Name></RequiredProperty>
            <RequiredProperty Include="PropC"><Name>PropC</Name></RequiredProperty>
          </ItemGroup>
          <Error Text="Property %(RequiredProperty.Name) required" Condition="'$(%(RequiredProperty.Name))' == ''" />
        </Target>
      </Project>


  • AjaxControlToolkit Resource Files Not Copied To Output in MSBuild Script

    - by Dario Solera
    I'm new to MSBuild, but I managed to set up the following simple script:

      <Project ToolsVersion="3.5" DefaultTargets="Compile" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
        <PropertyGroup>
          <Configuration Condition="'$(Configuration)' == ''">Debug</Configuration>
        </PropertyGroup>
        <ItemGroup>
          <SolutionRoot Include=".." />
          <BuildArtifacts Include=".\Artifacts\" />
          <SolutionFile Include="..\SolutionName.sln" />
        </ItemGroup>
        <Target Name="Clean">
          <RemoveDir Directories="@(BuildArtifacts)" />
        </Target>
        <Target Name="Init" DependsOnTargets="Clean">
          <MakeDir Directories="@(BuildArtifacts)" />
        </Target>
        <Target Name="Compile" DependsOnTargets="Init">
          <MSBuild Projects="@(SolutionFile)" Properties="OutDir=%(BuildArtifacts.FullPath);Configuration=$(Configuration)" />
          <MakeDir Directories="%(BuildArtifacts.FullPath)\_PublishedWebsites\RDE.XAP.UnifiedGui.Web\Temp" />
        </Target>
      </Project>

    The solution has 23 projects, 4 of which are web apps. Now, the script works fine and the output is generated correctly. The only problem I encounter is with two web app projects in the solution that use the AJAX Control Toolkit. The toolkit has a set of folders (e.g. ar, it, es, fr) that contain localized resources. These folders are not copied into the bin directory of the web apps when the solution is built with MSBuild, but they are copied when it is built in Visual Studio. How can I solve this in a clean manner? I know I could write a (quite convoluted) task that copies the directories after the compile, but it does not seem the right solution to me. Also, neither Google, SO nor MSDN could provide more details on this kind of issue.
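
    For completeness, the copy workaround mentioned above need not be terribly convoluted; a rough sketch (the toolkit source path and the MyWebApp name are placeholders, not from the question) hooked onto the Compile target:

      <Target Name="CopyToolkitResources" DependsOnTargets="Compile">
        <ItemGroup>
          <!-- The per-culture folders (ar, it, es, fr, ...) hold satellite resource assemblies. -->
          <ToolkitSatellites Include="..\Libs\AjaxControlToolkit\**\*.resources.dll" />
        </ItemGroup>
        <!-- %(RecursiveDir) re-creates the culture subfolders under bin. -->
        <Copy SourceFiles="@(ToolkitSatellites)"
              DestinationFolder=".\Artifacts\_PublishedWebsites\MyWebApp\bin\%(RecursiveDir)" />
      </Target>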


  • MSBuild file for deployment process

    - by Lee Englestone
    I could do with some pointers, code examples or references that may help me do the following in an MSBuild file, to help speed up the deployment process. This scenario involves getting a developer's 'local' version onto a 'development' server:

    1. Increment the developer's local web application's assembly version number.
    2. Publish the developer's local web application files somewhere.
    3. .rar the published files or folder into the format v[IncrementedAssemblyNumber].rar.
    4. Copy the .rar somewhere.
    5. Back up (.rar) the existing live website folder (located elsewhere) in the format Pre_v[IncrementedAssemblyNumber].rar.
    6. Move the backed-up .rar to a /Backup folder.
    7. Overwrite the development web files with the published local web files.

    Should be simple for all those MSBuild gurus out there. Like I said, answers or 'good and applicable' links would be much appreciated. Also, I'm thinking of getting one of the MSBuild books. From what I can tell there are 2, possibly 3 contenders. I am not using TFS. Can anyone recommend a book for beginning MSBuild? Ideally from people that have read more than one book on the subject. Cheers, -- Lee
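
    A rough sketch of the packaging and backup steps (3 through 6), assuming WinRAR's command-line rar.exe is on the PATH; every path and property name here is invented for illustration, and the Move task needs MSBuild 4.0 (on 3.5, a Copy plus Delete does the same):

      <Target Name="PackageAndBackup">
        <!-- Steps 3-4: archive the published output and copy it to the drop folder. -->
        <Exec Command='rar a "v$(NewVersion).rar" "$(PublishDir)\*"' />
        <Copy SourceFiles="v$(NewVersion).rar" DestinationFolder="$(DropDir)" />
        <!-- Steps 5-6: archive the live site and move the backup aside. -->
        <Exec Command='rar a "Pre_v$(NewVersion).rar" "$(LiveSiteDir)\*"' />
        <Move SourceFiles="Pre_v$(NewVersion).rar" DestinationFolder="$(BackupDir)" />
      </Target>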


  • Replace .sln with MSBuild and wrap contained projects into targets

    - by Filburt
    I'd like to create an MSBuild project that reflects the project dependencies in a solution and wraps the VS projects inside reusable targets. The problem I'd like to solve by doing this is to svn-export, build and deploy a specific assembly (and its dependencies) in a BizTalk application. My question is: how can I make the targets for svn-exporting, building and deploying reusable, and also reuse the wrapped projects when they are built for different dependencies? I know it would be simpler to just build the solution and deploy only the assemblies needed, but I'd like to reuse the targets as much as possible.

    The parts

    The project I'd like to deploy:

      <Project DefaultTargets="Deploy" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
        <PropertyGroup>
          <ExportRoot Condition="'$(Export)'==''">Export</ExportRoot>
        </PropertyGroup>
        <Target Name="Clean_Export">
          <RemoveDir Directories="$(ExportRoot)\My.Project.Dir" />
        </Target>
        <Target Name="Export_MyProject">
          <Exec Command="svn export svn://xxx/trunk/Biztalk2009/MyProject.btproj --force" WorkingDirectory="$(ExportRoot)" />
        </Target>
        <Target Name="Build_MyProject" DependsOnTargets="Export_MyProject">
          <MSBuild Projects="$(ExportRoot)\My.Project.Dir\MyProject.btproj" Targets="Build" Properties="Configuration=Release"></MSBuild>
        </Target>
        <Target Name="Deploy_MyProject" DependsOnTargets="Build_MyProject">
          <Exec Command="BTSTask AddResource -ApplicationName:CORE -Source:MyProject.dll" />
        </Target>
      </Project>

    The projects it depends upon look almost exactly like this (other .btproj and .csproj).


  • MSBuild / PowerShell: Copy SQL Server 2012 database to SQL Azure via BACPAC (for Continuous Integration)

    - by giveme5minutes
    I'm creating a continuous-integration MSBuild script which copies a database on an on-premise SQL Server 2012 instance to SQL Azure. Easy, right?

    Methods

    After a fair bit of research I've come across the following methods:

    1. Use PowerShell to access the DAC library directly, then use the MSBuild PowerShell extension to wrap the script. This would require installing PowerShell 3 and working out how to make the MSBuild PowerShell extension work with it, as apparently MS moved the DAC API to a different namespace in the latest version of the library. PowerShell would give direct access to the API, but may require quite a bit of boilerplate.
    2. Use the sample DAC Framework Client Side Tools, which requires compiling them myself, as the downloads available from Codeplex only include the Hosted version. It would also require fixing them to use DAC 3.0 classes, as they appear to currently use an earlier version of DAC. I could then call these tools from an <Exec Command="" /> in the MSBuild script. Less boilerplate, and if I hit any bumps in the road I can just make changes to the source.

    Processes

    Using whichever method, the process could be either:

    1. Export from on-premise SQL Server 2012 to a local BACPAC
    2. Upload the BACPAC to blob storage
    3. Import the BACPAC into SQL Azure via the Hosted DAC

    Or:

    1. Export from on-premise SQL Server 2012 to a local BACPAC
    2. Import the BACPAC into SQL Azure via the Client DAC

    Question

    All of the above seems to be quite a lot of effort for something that seems to be a standard feature... so before I start reinventing the wheel and documenting the results for all to see: is there something really obvious that I've missed here? Is there a pre-written script that MS has released that I have not yet uncovered? There's a command in the GUI of SQL Server Management Studio 2012 that does EXACTLY what I'm trying to do (right-click on the local database, click "Tasks", click "Deploy Database to SQL Azure"). Surely if it's a few clicks in the GUI it must be a single command on the command line somewhere??
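
    On the "single command" hunch: the DAC framework ships a command-line tool, SqlPackage.exe, whose Export and Import actions map onto the second process above. A sketch (server names, credentials and paths all invented, and untested against Azure here):

      <Target Name="CopyDatabaseToAzure">
        <!-- Export the on-premise database to a local BACPAC... -->
        <Exec Command='SqlPackage.exe /Action:Export /SourceServerName:localhost /SourceDatabaseName:MyDb /TargetFile:"$(TempDir)\MyDb.bacpac"' />
        <!-- ...then import it into SQL Azure via the client-side DAC. -->
        <Exec Command='SqlPackage.exe /Action:Import /SourceFile:"$(TempDir)\MyDb.bacpac" /TargetServerName:myserver.database.windows.net /TargetDatabaseName:MyDb /TargetUser:user /TargetPassword:pass' />
      </Target>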


  • Extension icons in Chrome for Mac have disappeared

    - by Seth Williamson
    On my new MacBook Pro running Snow Leopard 10.6.2 and Google Chrome for Mac, 5.0.307.11 beta, some (but not all) of the icons for extensions have disappeared. I can tell SOMETHING is still there, because the space is occupied and you can see it indent as you mouse over it. You can also see the name of the extension pop up in a balloon below. But the extension icon is invisible and the extension itself doesn't work. Right now it's happened with Google Translate, the show-in-IE extension, Wikipedia Chromium, Send with GMail and Clip to Evernote. The LastPass and Feedly extension icons are still visible. Any ideas on how to get them back and stop this from happening again? Seth Williamson


  • How to install chrome autosave extension?

    - by Oguz Can Sertel
    I would like to install the Chrome autosave plugin on Ubuntu. When I try to install it with these steps https://github.com/NV/chrome-devtools-autosave-server , I get some errors... Node and npm were not installed out of the box on Ubuntu 12.10, so I installed them with these commands:

      sudo apt-get install npm
      sudo apt-get install node

    and then I tried to install autosave. Here is the output:

      sudo npm install -g autosave
      npm http GET https://registry.npmjs.org/autosave
      npm http 304 https://registry.npmjs.org/autosave
      npm http GET https://registry.npmjs.org/commander
      npm http 304 https://registry.npmjs.org/commander
      /usr/local/bin/autosave -> /usr/local/lib/node_modules/autosave/bin/autosave

      > [email protected] install /usr/local/lib/node_modules/autosave
      > node ./scripts/install.js

      npm ERR! error installing [email protected]
      npm WARN This failure might be due to the use of legacy binary "node"
      npm WARN For further explanations, please read
      npm WARN /usr/share/doc/nodejs/README.Debian
      npm WARN
      npm ERR! [email protected] install: `node ./scripts/install.js`
      npm ERR! `sh "-c" "node ./scripts/install.js"` failed with 1
      npm ERR!
      npm ERR! Failed at the [email protected] install script.
      npm ERR! This is most likely a problem with the autosave package,
      npm ERR! not with npm itself.
      npm ERR! Tell the author that this fails on your system:
      npm ERR! node ./scripts/install.js
      npm ERR! You can get their info via:
      npm ERR! npm owner ls autosave
      npm ERR! There is likely additional logging output above.
      npm ERR!
      npm ERR! System Linux 3.5.0-17-generic
      npm ERR! command "/usr/bin/nodejs" "/usr/bin/npm" "install" "-g" "autosave"
      npm ERR! cwd /home/naczu
      npm ERR! node -v v0.6.19
      npm ERR! npm -v 1.1.4
      npm ERR! code ELIFECYCLE
      npm ERR! message [email protected] install: `node ./scripts/install.js`
      npm ERR! message `sh "-c" "node ./scripts/install.js"` failed with 1
      npm ERR! errno {}
      npm ERR!
      npm ERR! Additional logging details can be found in:
      npm ERR! /home/naczu/npm-debug.log
      npm not ok

    and here is README.Debian:

      nodejs for Debian
      =================

      packaged modules
      ----------------
      The global search path for modules is /usr/lib/nodejs. Future packages of node modules will use that directory, so it should be used wisely.

      user modules
      ------------
      Node looks for modules in the ./node_modules directory first; please read the node#modules documentation carefully for more information. Node does not look for modules in /usr/local/lib/node_modules, where npm puts them. Please read npm-link(1) of the npm package to understand how to properly use npm-installed modules in a project. Note that require.paths is not supported in future node versions. See also node(1) for more information about NODE_PATH.

      nodejs command
      --------------
      The upstream name for the Node.js interpreter command is "node". In Debian the interpreter command has been changed to "nodejs". This was done to prevent a namespace collision: other commands use the same name in their upstreams, such as ax25-node from the "node" package. Scripts calling Node.js as a shell command must be changed to instead use the "nodejs" command.


  • Having trouble running code analysis from command prompt with msbuild

    - by devlife
    I'm using VS2010 RC while targeting .NET 3.5. I can run code analysis via Visual Studio without a problem. However, when I try to run code analysis on our CI server it isn't getting executed. When I attempt to build using MSBuild 4.0 I get the following exception:

      C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\CodeAnalysis\Microsoft.CodeAnalysis.targets(129,9): error MSB4018: The "CodeAnalysis" task failed unexpectedly.
      C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\CodeAnalysis\Microsoft.CodeAnalysis.targets(129,9): error MSB4018: System.TypeLoadException: Could not load type 'System.Runtime.Versioning.TargetFrameworkAttribute' from assembly 'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'

    Like I said, it works fine when I run it through VS.

