Search Results

Search found 20122 results on 805 pages for 'build definition'.

Page 48/805 | < Previous Page | 44 45 46 47 48 49 50 51 52 53 54 55  | Next Page >

  • Having 'mvn deploy' in Hudson's build goals and the standard approach of releasing

    - by user68759
    I set up Hudson for my project with the build goals mvn clean deploy site:site, running a build every midnight and whenever there are new changes. One thing I have been wondering is whether I should include deploy in the build goals, because if I had just released version 1.0.0 of my project (changed the pom to version 1.0.0 and committed it) but not yet increased the version number to 1.0.1-SNAPSHOT for several days, I could end up with multiple different 1.0.0 builds being deployed at different times. But I've seen people using deploy in their Hudson build goals - I wonder how they deal with this issue. What's the correct way of doing a release with Maven, actually? Thanks for any pointers!
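
    The standard approach is the maven-release-plugin, which makes the version bump and the tag a single step, so a given release version is deployed exactly once and Hudson's 'mvn deploy' only ever publishes SNAPSHOT builds between releases. A minimal sketch of the release itself:

        mvn release:prepare   # bumps the POM to 1.0.0, commits and tags it, then moves trunk to 1.0.1-SNAPSHOT
        mvn release:perform   # checks out the tag and runs 'mvn deploy' against that fixed revision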

    Read the article

  • delete the 'target' directory after build

    - by pstanton
    Hi, I know this is probably frowned upon by Maven lovers, but the whole 'target' directory is a waste of space in the context of our program and its deployment. We have other build processes responsible for creating the actual deployment, and I currently delete the target dir manually after every Maven build so that its contents don't interfere with my file searches etc. Is there a way to delete this dir automatically at the end of a Maven build/install? Thanks, p.
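
    One possibility (a sketch only; execution ordering within a lifecycle phase can be fragile, so test it against your setup) is to bind the maven-clean-plugin to a late phase, so target is wiped after the artifact has already been copied into the local repository:

        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-clean-plugin</artifactId>
          <executions>
            <execution>
              <id>clean-after-install</id>
              <phase>install</phase>  <!-- fires once the artifact is already in ~/.m2 -->
              <goals>
                <goal>clean</goal>
              </goals>
            </execution>
          </executions>
        </plugin>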

    Read the article

  • How to make the Eclipse IDE build faster

    - by Solitaire
    Hi all. I am using the Eclipse IDE for development, and it is taking too much time to build: it hangs when the build progress reaches 78% and shows "refreshing workspace" several times, which eats up a lot of time. Please tell me how to disable the unwanted "refreshing workspace" and other time-consuming activities, and make the build faster. Thanks

    Read the article

  • How to build a Linux system from the kernel to the UI layer

    - by mohit
    Hi, I have been looking into the MeeGo, Maemo and Android architectures. They all take the Linux kernel, build some libraries on it, then build middle-layer libraries [e.g. telephony, media etc...]. Suppose I want to build my own system: the Linux kernel, with some binaries like glibc, D-Bus, ..., a UI toolkit like GTK+ and its binaries. I want to compile every project from source to customize my own Linux system for desktop, netbook and handheld devices. [starting with the netbook first :)] How can I build my own customized system from the kernel to the UI?

    Read the article

  • OutputPath ignored on projects being built by TFS 2010

    - by bovium
    I have installed TFS 2010 Beta 2 with default settings and configured a CI build with a solution containing the individual projects. My *.csproj files could have: <OutputPath>bin\debug\</OutputPath> Or alternatively: <OutDir>bin\debug\</OutDir> When the build server is done building and running tests etc., all the assemblies are placed in the root of the build drop-off folder. How do I configure the build to keep the OutputPath or OutDir from my projects and store the assemblies and content in the matching folder structure (builddropfolder\bin\debug\)? I have found a number of posts on this, but most of them relate to TFS 2008 and I have not found solutions for TFS 2010. Perhaps it is possible to solve this in the new workflow file for the build server?

    Read the article

  • Set Hudson build number from a script

    - by Joe Schneider
    Is there a way to set the next build number in Hudson from a script? I have the nextBuildNumber plug-in installed, and attempted to use wget with --post-data, but that page appears to require login. I have two steps of a chained build and I want to keep the build numbers in sync.
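
    wget can send credentials with the request. The option names below are real wget flags, but the submit URL and the nextBuildNumber parameter name are assumptions about the form the nextBuildNumber plug-in exposes, so verify them against your Hudson instance:

        wget --auth-no-challenge --user=ci-user --password=secret \
             --post-data="nextBuildNumber=125" \
             http://hudson.example.com/job/downstream-job/nextbuildnumber/submit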

    Read the article

  • TeamCity's build agent unable to find PowerShell

    - by cincura.net
    I have TC's build agent installed on Windows 2008 R2 SP1 Core. The server has PowerShell 2.0 installed (double-checked; I actually downloaded the TC installation from PS). Looking at some build configurations, I see they are incompatible with this agent because powershell_x86/powershell_x64 is required. I tried deleting the build agent's dirs to force an upgrade, but no luck. Interestingly, if I provide the powershell_x86, powershell_x86_Path (and 64-bit) variables in the config file manually, everything runs fine. Is there anything I can do to have the build agent find PowerShell automatically? What/where is it looking for? Maybe the 'Core' is the problem.
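
    Since supplying the variables by hand already works, one stopgap is to persist them in the agent's own config file so they survive agent upgrades. A sketch, with paths assumed for a 64-bit box:

        # <agent home>\conf\buildAgent.properties
        powershell_x64=2.0
        powershell_x64_Path=C:\Windows\System32\WindowsPowerShell\v1.0
        powershell_x86=2.0
        powershell_x86_Path=C:\Windows\SysWOW64\WindowsPowerShell\v1.0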

    Read the article

  • NullReferenceException at Microsoft.Silverlight.Build.Tasks.CompileXaml.LoadAssemblies(ITaskItem[] ReferenceAssemblies)

    - by Eugene Larchick
    Hi, I updated my Visual Studio 2010 to version 10.0.30319.1 (RTM) and started getting the following exception during the build:

        System.NullReferenceException: Object reference not set to an instance of an object.
           at Microsoft.Silverlight.Build.Tasks.CompileXaml.LoadAssemblies(ITaskItem[] ReferenceAssemblies)
           at Microsoft.Silverlight.Build.Tasks.CompileXaml.get_GetXamlSchemaContext()
           at Microsoft.Silverlight.Build.Tasks.CompileXaml.GenerateCode(ITaskItem item, Boolean isApplication)
           at Microsoft.Silverlight.Build.Tasks.CompileXaml.Execute()
           at Bohr.Silverlight.BuildTasks.BohrCompileXaml.Execute()

    The code of BohrCompileXaml.Execute is the following:

        public override bool Execute() {
            List<TaskItem> pages = new List<TaskItem>();
            foreach (ITaskItem item in SilverlightPages) {
                string newFileName = getGeneratedName(item.ItemSpec);
                String content = File.ReadAllText(item.ItemSpec);
                String parentClassName = getParentClassName(content);
                if (null != parentClassName) {
                    content = content.Replace("<UserControl", "<" + parentClassName);
                    content = content.Replace("</UserControl>", "</" + parentClassName + ">");
                    content = content.Replace("bohr:ParentClass=\"" + parentClassName + "\"", "");
                }
                File.WriteAllText(newFileName, content);
                pages.Add(new TaskItem(newFileName));
            }
            if (null != SilverlightApplications) {
                foreach (ITaskItem item in SilverlightApplications) {
                    Log.LogMessage(MessageImportance.High, "Application: " + item.ToString());
                }
            }
            foreach (ITaskItem item in pages) {
                Log.LogMessage(MessageImportance.High, "newPage: " + item.ToString());
            }
            CompileXaml xamlCompiler = new CompileXaml();
            xamlCompiler.AssemblyName = AssemblyName;
            xamlCompiler.Language = Language;
            xamlCompiler.LanguageSourceExtension = LanguageSourceExtension;
            xamlCompiler.OutputPath = OutputPath;
            xamlCompiler.ProjectPath = ProjectPath;
            xamlCompiler.RootNamespace = RootNamespace;
            xamlCompiler.SilverlightApplications = SilverlightApplications;
            xamlCompiler.SilverlightPages = pages.ToArray();
            xamlCompiler.TargetFrameworkDirectory = TargetFrameworkDirectory;
            xamlCompiler.TargetFrameworkSDKDirectory = TargetFrameworkSDKDirectory;
            xamlCompiler.BuildEngine = BuildEngine;
            bool result = xamlCompiler.Execute(); // HERE we got the error!

    And the definition of the task:

        <BohrCompileXaml LanguageSourceExtension="$(DefaultLanguageSourceExtension)"
                         Language="$(Language)"
                         SilverlightPages="@(Page)"
                         SilverlightApplications="@(ApplicationDefinition)"
                         ProjectPath="$(MSBuildProjectFullPath)"
                         RootNamespace="$(RootNamespace)"
                         AssemblyName="$(AssemblyName)"
                         OutputPath="$(IntermediateOutputPath)"
                         TargetFrameworkDirectory="$(TargetFrameworkDirectory)"
                         TargetFrameworkSDKDirectory="$(TargetFrameworkSDKDirectory)">
          <Output ItemName="Compile" TaskParameter="GeneratedCodeFiles" />
          <!-- Add to the list of files written. It is used in Microsoft.Common.Targets
               to clean up for the next clean build -->
          <Output ItemName="FileWrites" TaskParameter="WrittenFiles" />
          <Output ItemName="_GeneratedCodeFiles" TaskParameter="GeneratedCodeFiles" />
        </BohrCompileXaml>

    What can be the reason? And how can I get more info about what's happening inside the CompileXaml class?

    Read the article

  • Manage build target variables in an iPhone project

    - by DougW
    We're transitioning to an automated build process for our iPhone projects. These projects can be checked out by individual devs, in which case all the API URLs need to point to a certain path. There are also a variety of build environments, each with its own API root path. I could probably add multiple different build targets and have each of them include a different URL definitions file, but this seems like a lot of upkeep and a bit of overkill. Any best practices out there for swapping a few environment variables between build environments without much fuss?
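
    One common pattern is a single header whose values switch on per-configuration preprocessor definitions (one GCC_PREPROCESSOR_DEFINITIONS entry per build configuration, rather than one target per environment). The macro names and URLs here are hypothetical:

        // ApiConfig.h - BUILD_ENV_* is defined per build configuration, not per target
        #if defined(BUILD_ENV_PRODUCTION)
          #define API_ROOT @"https://api.example.com/"
        #elif defined(BUILD_ENV_STAGING)
          #define API_ROOT @"https://staging.example.com/api/"
        #else
          #define API_ROOT @"http://localhost:8080/api/"  // individual dev checkout
        #endif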

    Read the article

  • Visual Studio build and deploy ordering

    - by mthornal
    We have a VS 2010 solution that includes a few class library projects, a SQL Server 2008 database project and a Wix setup project. We are trying to get to a point where the following happens in the order specified:

    1. Build the class library projects and the database project.
    2. Deploy the database project to generate the deploy .sql script.
    3. Build the Wix setup project.

    The reason for the desired order is that the setup project requires the deployment .sql scripts, as it will use these to generate/update the database on the machine where the msi is run. It seems that there is no way within a Visual Studio solution file to create this type of build/deploy/build order. Is this correct? Thanks

    Read the article

  • Build model with nested model in rspec integration test

    - by user1116573
    I understand that I can do something like this in RSpec:

        let(:project) { Project.new }

    but in my app a project accepts_nested_attributes_for tasks, and when I generate the Project form I build a task along with it using:

        @project = Project.new
        @project.tasks.build

    I need something like:

        let(:project) { Project.new.tasks.build }

    but that doesn't seem to work. How can I do this as a let in my RSpec test?
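
    Since let returns whatever the block evaluates to, the chained call above hands back the built task rather than the project. Building the task inside the block and returning the project fixes it:

        let(:project) do
          project = Project.new
          project.tasks.build
          project   # return the project itself, not the task
        end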

    Read the article

  • using makefile targets to set build options

    - by leo grrr
    This is either trivial or runs counter to the philosophy of how make should be used, but I'd like to have a command line that reads as "make debug" rather than "make DEBUG=1". I tried creating a phony target called debug that did nothing except set the DEBUG variable, but then there was a difference between "make debug build" and "make build debug"--namely that in one case, the variable got set after the build happened. Is there a way to give certain targets precedence? Thanks for your help.
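
    GNU make's target-specific variables give exactly this: a variable assignment attached to a goal that propagates to its prerequisites, so a plain 'make debug' does the whole job and nothing has to run in a particular order. A sketch with assumed file and target names (the recipe line must start with a tab):

        # 'make debug' builds the same target with DEBUG set
        debug: DEBUG = 1
        debug: build

        build:
        	$(CC) $(if $(DEBUG),-g -O0,-O2) -o app main.c

        .PHONY: debug build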

    Read the article

  • Getting / building the latest NHibernate build

    - by Alex Yakunin
    I'd like to get the latest NHibernate build or build it on my own. The build available at SourceForge is dated Nov 2009, although I see there was a lot of activity later, especially related to LINQ development. So what is the best option? I can:

    1. Get the latest source code and try to build it. Are there any instructions for this?
    2. Get one of the latest builds shared by someone else. Are there any people maintaining such builds?

    Please note that I'm not interested in 8-month-old builds - I need the latest code for tests (LINQ, performance). I know there is a similar question, but it looks like the top answers there are outdated.

    Read the article

  • Are SharePoint site templates really less performant than site definitions?

    - by Jim
    So, it seems in the SharePoint blogosphere that everybody just copies and pastes the same bullet points from other blogs. One bullet point I've seen is that SharePoint site templates are less performant than site definitions because site definitions are stored on the file system. Is that true? It seems odd that site templates would be less performant. It's my understanding that all site content lives in a database, whether you use a site template or a site definition. A site template is applied once to the database, and from then on the site should not care if the content was created using a site template or not. So, does anybody have an architectural reason why a site template would be less performant than a site definition?

    Edit: Links to the blogs that say there is a performance difference:

    From MSDN: "Because it is slow to store templates in and retrieve them from the database, site templates can result in slower performance."

    From DevX: "However, user templates in SharePoint can lead to performance problems and may not be the best approach if you're trying to create a set of reusable templates for an entire organization."

    From IT Footprint: "Because it is slow to store templates in and retrieve them from the database, site templates can result in slower performance. Templates in the database are compiled and executed every time a page is rendered."

    From Branding SharePoint: "Custom site definitions hold the following advantages over custom templates: Data is stored directly on the Web servers, so performance is typically better."

    At a minimum, I think the above articles are incomplete, and I think several are misleading based on what I know of SharePoint's architecture. I read another blog post that argued against the performance differences, but I can't find the link.

    Read the article

  • What is the formal definition of a meta package?

    - by kojiro
    There are several examples of packaging where an application package is built, named, described, even licensed, but contains only setup code and dependencies -- it has no first-class runtime software of its own. I would call this "meta-packaging". This seems to be particularly popular in the open-source world, including examples like kde-meta (Gentoo Portage), Plone, and I'm sure lots of others. I can see how it's a useful practice, but despite it existing as a practice, I couldn't find a formal definition of either "meta-packaging" or "meta-egg" (Python) in searching the web. Is that not the correct term? If it is, is it such common-sense that it needs no formal definition? If not, what is the correct way to put it?
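
    For the Python case specifically, the practice is easy to demonstrate even without a formal definition: a setup script that declares metadata, a license and dependencies, but ships no runtime code of its own. All package names below are hypothetical:

        # setup.py for a "meta-egg": metadata and dependencies only
        from setuptools import setup

        setup(
            name="mysuite-meta",
            version="1.0",
            description="Installs the complete mysuite stack",
            license="MIT",
            install_requires=["mysuite-core", "mysuite-plugins", "mysuite-docs"],
            packages=[],  # no first-class runtime code of its own
        )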

    Read the article

  • Configuring UppercuT for automated builds

    - by deepasun
    This is my CruiseControl.NET config file, using the configuration preprocessor (http://confluence.public.thoughtworks.org/display/CCNET/Configuration+Preprocessor):

        <!-- PROJECT STRUCTURE -->
        <cb:define name="WindowsFormsApplication1">
          <project name="$(projectName)">
            <workingDirectory>$(working_directory)\$(projectName)</workingDirectory>
            <artifactDirectory>$(drop_directory)\$(projectName)</artifactDirectory>
            <category>$(projectName)</category>
            <queuePriority>$(queuePriority)</queuePriority>
            <triggers>
              <intervalTrigger name="continuous" seconds="60" buildCondition="IfModificationExists" />
            </triggers>
            <sourcecontrol type="svn">
              <executable>c:\program files\subversion\bin\svn.exe</executable>
              <!--<trunkUrl>http://192.168.1.8/trainingrepos/deepasundari/WindowsFormsApplication1</trunkUrl>-->
              <trunkUrl>$(svnPath)</trunkUrl>
              <workingDirectory>$(working_directory)\$(projectName)</workingDirectory>
            </sourcecontrol>
            <tasks>
              <exec>
                <executable>$(working_directory)\$(projectName)\build.bat</executable>
              </exec>
            </tasks>
            <publishers>
              <merge>
                <files>
                  <file>$(working_directory)\$(projectName)\build_output\build_artifacts\*.xml</file>
                  <file>$(working_directory)\$(projectName)\build_output\build_artifacts\mbunit\*-results.xml</file>
                  <file>$(working_directory)\$(projectName)\build_output\build_artifacts\nunit\*-results.xml</file>
                  <file>$(working_directory)\$(projectName)\build_output\build_artifacts\ncover\*-results.xml</file>
                  <file>$(working_directory)\$(projectName)\build_output\build_artifacts\ndepend\*.xml</file>
                </files>
              </merge>
              <!--<email from="[email protected]" mailhost="smtp.somewhere.com" includeDetails="TRUE">
                <users>
                  <user name="YOUR NAME" group="BuildNotice" address="[email protected]" />
                </users>
                <groups>
                  <group name="BuildNotice" notification="change" />
                </groups>
              </email>-->
              <xmllogger/>
              <statistics>
                <statisticList>
                  <firstMatch name="Svn Revision" xpath="//modifications/modification/changeNumber" />
                  <firstMatch name="ILInstructions" xpath="//ApplicationMetrics/@NILInstruction" />
                  <firstMatch name="LinesOfCode" xpath="//ApplicationMetrics/@NbLinesOfCode" />
                  <firstMatch name="LinesOfComment" xpath="//ApplicationMetrics/@NbLinesOfComment" />
                </statisticList>
              </statistics>
              <modificationHistory onlyLogWhenChangesFound="true" />
              <rss/>
            </publishers>
          </project>
        </cb:define>

        <cb:WindowsFormsApplication1 projectname="WindowsFormsApplication1"
                                     queuepriority="80"
                                     svnpath="http://192.168.1.8/trainingrepos/deepasundari/WindowsFormsApplication1" />

    It is not producing the build directory in code_drop, but it is updating reports.xml with the new build. What is the problem?

    Read the article

  • Embedding mercurial revision information in Visual Studio c# projects automatically

    - by Mark Booth
    Original Problem

    In building our projects, I want the Mercurial id of each repository to be embedded within the product(s) of that repository (the library, application or test application). I find it makes it so much easier to debug an application being run by customers 8 timezones away if you know precisely what went into building the particular version of the application they are using. As such, every project (application or library) in our systems implements a way of getting at the associated revision information.

    I also find it very useful to be able to see if an application has been compiled with clean (un-modified) changesets from the repository. 'hg id' usefully appends a + to the changeset id when there are uncommitted changes in a repository, so this allows us to easily see if people are running a clean or a modified version of the code.

    My current solution is detailed below, and fulfills the basic requirements, but there are a number of problems with it.

    Current Solution

    At the moment, to each and every Visual Studio solution, I add the following "Pre-build event command line" commands:

        cd $(ProjectDir)
        HgID

    I also add an HgID.bat file to the project directory:

        @echo off
        type HgId.pre > HgId.cs
        For /F "delims=" %%a in ('hg id') Do <nul >>HgId.cs set /p = @"%%a"
        echo ; >> HgId.cs
        echo } >> HgId.cs
        echo } >> HgId.cs

    along with an HgId.pre file, which is defined as:

        namespace My.Namespace {
            /// <summary> Auto generated Mercurial ID class. </summary>
            internal class HgID {
                /// <summary> Mercurial version ID [+ is modified] [Named branch]</summary>
                public const string Version =

    When I build my application, the pre-build event is triggered on all libraries, creating a new HgId.cs file (which is not kept under revision control) and causing the library to be re-compiled with the new 'hg id' string in 'Version'.

    Problems with the current solution

    The main problem is that since HgId.cs is re-created at each pre-build, every time we need to compile anything, all projects in the current solution are re-compiled. Since we want to be able to easily debug into our libraries, we usually keep many libraries referenced in our main application solution. This can result in build times which are significantly longer than I would like. Ideally I would like the libraries to compile only if the contents of the HgId.cs file have actually changed, as opposed to having been re-created with exactly the same contents.

    The second problem with this method is its dependence on specific behaviour of the Windows shell. I've already had to modify the batch file several times, since the original worked under XP but not Vista, the next version worked under Vista but not XP, and finally I managed to make it work with both. Whether it will work with Windows 7, however, is anyone's guess, and as time goes on I see it more likely that contractors will expect to be able to build our apps on their Windows 7 boxen.

    Finally, I have an aesthetic problem with this solution: batch files and bodged-together template files feel like the wrong way to do this.

    My actual questions

    How would you solve/how are you solving the problem I'm trying to solve? What better options are out there than what I'm currently doing?

    Rejected solutions to these problems

    Before I implemented the current solution, I looked at Mercurial's Keyword extension, since it seemed like the obvious solution. However, the more I looked at it and read people's opinions, the more I came to the conclusion that it wasn't the right thing to do. I also remember the problems that keyword substitution has caused me in projects at previous companies (just the thought of ever having to use SourceSafe again fills me with a feeling of dread *8').

    Also, I don't particularly want to have to enable Mercurial extensions to get the build to complete. I want the solution to be self-contained, so that it isn't easy for the application to be accidentally compiled without the embedded version information just because an extension isn't enabled or the right helper software hasn't been installed.

    I also thought of writing this in a better scripting language, one where I would only write the HgId.cs file if the content had actually changed, but all of the options I could think of would require my co-workers, contractors and possibly customers to install software they might not otherwise want (for example cygwin). Any other options people can think of would be appreciated.

    Update

    Partial solution: having played around with it for a while, I've managed to get the HgId.bat file to only overwrite the HgId.cs file if it changes:

        @echo off
        type HgId.pre > HgId.cst
        For /F "delims=" %%a in ('hg id') Do <nul >>HgId.cst set /p = @"%%a"
        echo ; >> HgId.cst
        echo } >> HgId.cst
        echo } >> HgId.cst
        fc HgId.cs HgId.cst >NUL
        if %errorlevel%==0 goto :ok
        copy HgId.cst HgId.cs
        :ok
        del HgId.cst

    Problems with this solution

    Even though HgId.cs is no longer re-created every time, Visual Studio still insists on compiling everything every time. I've tried looking for solutions and tried checking "Only build startup projects and dependencies on Run" in Tools|Options|Projects and Solutions|Build and Run, but it makes no difference. The second problem also remains, and now I have no way to test whether it will work with Vista, since that contractor is no longer with us. If anyone can test this batch file on a Windows 7 and/or Vista box, I would appreciate hearing how it went. Finally, my aesthetic problem with this solution is even stronger than it was before, since the batch file is more complex and there is now more to go wrong. If you can think of any better solutions, I would love to hear about them.

    Read the article

  • Where is the definitive download location for MBSA's "wsusscn2.cab" file for offline-mode scans?

    - by Chris W. Rea
    I'm running Microsoft Baseline Security Analyzer 2.1 against some servers that, by design of firewall restrictions, have no outbound access to the Internet, so I want to run MBSA in offline mode. In order to do so, I need the list of updates in the file named "wsusscn2.cab". Is there a well-known page or URL at Microsoft for downloading the most up-to-date version of that file for MBSA offline mode? Thank you.

    Read the article

  • rpm build from a .src.rpm file

    - by danielrutledge
    Hi all, I'm trying to build from a *.src.rpm file on FC 12 in such a way that the files are distributed across my system as they would be with a normal binary build (in this case, *.h files end up in /usr/include). When I ran the build, the headers weren't present. Here's my command and its output:

        [root@localhost sphirewalld]# rpm -ivv /home/dan/Downloads/gtest-1.3.0-2.20090601svn257.fc12.src.rpm
        ============== /home/dan/Downloads/gtest-1.3.0-2.20090601svn257.fc12.src.rpm
        Expected size: 489395 = lead(96)+sigs(180)+pad(4)+data(489115)
          Actual size: 489395
        loading keyring from pubkeys in /var/lib/rpm/pubkeys/*.key
        couldn't find any keys in /var/lib/rpm/pubkeys/*.key
        loading keyring from rpmdb
        opening db environment /var/lib/rpm/Packages cdb:mpool:joinenv
        opening db index /var/lib/rpm/Packages rdonly mode=0x0
        locked db index /var/lib/rpm/Packages
        opening db index /var/lib/rpm/Name rdonly mode=0x0
        read h# 931 Header sanity check: OK
        added key gpg-pubkey-57bbccba-4a6f97af to keyring
        read h# 1327 Header sanity check: OK
        added key gpg-pubkey-7fac5991-4615767f to keyring
        read h# 1420 Header sanity check: OK
        added key gpg-pubkey-16ca1a56-4a100959 to keyring
        read h# 1896 Header sanity check: OK
        added key gpg-pubkey-a3a882c1-4a1009ef to keyring
        Using legacy gpg-pubkey(s) from rpmdb
        /home/dan/Downloads/gtest-1.3.0-2.20090601svn257.fc12.src.rpm: Header SHA1 digest: OK (3e98ed9b1631395d417e00f35c83ebe588ea9d3b)
        added source package [0]
        found 1 source and 0 binary packages
        Expected size: 489395 = lead(96)+sigs(180)+pad(4)+data(489115)
          Actual size: 489395
        InstallSourcePackage at: psm.c:232: Header SHA1 digest: OK (3e98ed9b1631395d417e00f35c83ebe588ea9d3b)
        gtest-1.3.0-2.20090601svn257.fc12
        ========== Directories not explicitly included in package:
        0 /root/rpmbuild/SOURCES/
        1 /root/rpmbuild/SPECS/
        ==========
        warning: user mockbuild does not exist - using root
        warning: group mockbuild does not exist - using root
        fini 100664 1 ( 0, 0) 478034 /root/rpmbuild/SOURCES/gtest-1.3.0.tar.bz2;4ba93ce1 unknown
        warning: user mockbuild does not exist - using root
        warning: group mockbuild does not exist - using root
        fini 100644 1 ( 0, 0) 30505 /root/rpmbuild/SOURCES/gtest-svnr257.patch;4ba93ce1 unknown
        warning: user mockbuild does not exist - using root
        warning: group mockbuild does not exist - using root
        fini 100644 1 ( 0, 0) 2732 /root/rpmbuild/SPECS/gtest.spec;4ba93ce1 unknown
        GZDIO: 63 reads, 511788 total bytes in 0.005930 secs
        closed db index /var/lib/rpm/Name
        closed db index /var/lib/rpm/Packages
        closed db environment /var/lib/rpm/Packages

    Thanks for your help.
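
    Note that 'rpm -i' on a .src.rpm only unpacks the sources and spec file into /root/rpmbuild; it never compiles or installs anything, which is why no headers appear. A sketch of the usual two-step workflow (the -devel subpackage is assumed to be what carries the /usr/include files):

        rpmbuild --rebuild /home/dan/Downloads/gtest-1.3.0-2.20090601svn257.fc12.src.rpm
        # then install the binary RPMs the rebuild drops under RPMS/<arch>/
        rpm -ivh /root/rpmbuild/RPMS/*/gtest-*.rpm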

    Read the article

  • HD Video Capture Card w/ Good API?

    - by Sheep Slapper
    Does anyone here know of a good HD video capture card that has a good (comprehensive) API? I administer a few servers that do some video encoding right now, but when we make the switch to HD cameras, they won't be sufficient. In addition to this, the servers we have now are black boxes, closed to me except to start/stop the video capture device. I'd like to be able to roll my own, so we can better integrate it with our existing systems, but I know almost nothing about what kind of HD capture cards are out there, and if I can avoid spending money just to test their APIs that would rock. So does anyone have any experience with this? All our other software is in C#, and I'd like to set up the new servers with web interfaces to start/stop the capture (also in C#, using .NET 3.5 probably). I'm not sure how language specific these APIs would be, but that's what I'm working with just as a reference point. I appreciate any help the community can give!

    Read the article

  • How do I replace the screen of a Dell Inspiron 1545?

    - by Ajus10
    I got a new screen for a Dell Inspiron 1545. The old screen says Dell Inspiron 1545 LP156WH1 (TL)(C1?) HD and the new one says Dell Inspiron 1545 LP156WH1 (TL)(C1?) LCD Does that make a difference? All I can get to work on the new screen is the backlight. The old screen had a crack. Now when I plug the old one in, it will not turn on at all. Could I have blown the inverter or messed up the cable?

    Read the article

  • PSU requirement question for my PC setup.

    - by user69474
    I understand that sometimes the PSU may be way more powerful than required, but in this case I'm not too sure. Sometimes when I play games, my computer crashes and restarts itself 10 minutes into the game. Once I received a message saying something like the power supply is overheating. OK, so I have a 500W PSU, and I have:

    1x internal DVD writer
    1x SATA 250GB HD
    1x Nvidia 8500 GT
    2GB RAM

    As I'm planning to get an additional 250GB SATA HD, I wonder if I need a bigger PSU as well, given the crashes I have experienced. Should I upgrade my PSU to 650W, or is that excessive?
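
    As a rough sanity check, adding up typical draws for this class of hardware (all figures are ballpark estimates, not measurements):

        CPU (typical desktop)        ~65-95 W
        Nvidia 8500 GT               ~30-45 W
        2x SATA HDD (~10 W each)     ~20 W
        DVD writer (while burning)   ~15-25 W
        Motherboard + 2GB RAM        ~30-40 W
        -------------------------------------
        Worst-case total             ~225 W

    On those numbers a healthy 500W unit has ample headroom, and the extra drive adds only about 10W, so the crashes are more likely a failing or overheating PSU than an undersized one.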

    Read the article
