Search Results

Search found 18239 results on 730 pages for 'build triggers'.

  • How can I use an Ant foreach iteration with values from a file?

    - by Egon Willighagen
    In our Ant build environment, I have to do the same task for a number of items. The AntContrib foreach task is useful for that. However, the list is in a parameter, where I actually have the list in a file. How can I iterate over items in a file in a foreach-like way in Ant? Something like (pseudo-code):

        <foreach target="compile-module" listFromFile="$fileWithModules"/>

    I'm happy to write a custom Task, and welcome any suggestion on possible solutions.
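
    One possible approach without a custom task (a sketch, assuming the file lists one module per line; the loadfile property name is mine):

        <loadfile property="modules" srcFile="${fileWithModules}"/>
        <foreach target="compile-module" param="module"
                 list="${modules}" delimiter="${line.separator}"/>

    Blank lines in the file would show up as empty tokens, so the file may need trimming first.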

  • Visual Studio 2010 Publish Web feature not including all DLLs

    - by manu08
    I have an ASP.NET MVC 2 application.

    - The Web project contains a reference to SomeProject.
    - SomeProject contains references to ExternalAssembly1 and ExternalAssembly2.
    - SomeProject explicitly calls into ExternalAssembly1, but NOT ExternalAssembly2.
    - ExternalAssembly1 calls into ExternalAssembly2.

    When I perform a local build everything is cool. All DLLs are included in the bin\debug folder. The problem is that when I use the Publish Web command in Visual Studio 2010, it deploys everything except ExternalAssembly2. It appears to ignore assemblies that aren't directly used (remember, ExternalAssembly2 is only used by ExternalAssembly1). Is there any way I can tell Visual Studio 2010 to include ExternalAssembly2?
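
    One workaround that's sometimes suggested (a sketch; the type name below is hypothetical and stands for any public type in the assembly) is to reference ExternalAssembly2 directly from the Web project and touch one of its types, so the compiler records a hard dependency:

        // Somewhere in the Web project, e.g. inside the class in Global.asax.cs.
        // Referencing a type creates a direct assembly reference, so Publish Web
        // should treat ExternalAssembly2 as used. (SomePublicType is hypothetical.)
        private static readonly Type _forceExternalAssembly2 =
            typeof(ExternalAssembly2.SomePublicType);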

  • Building ELF within Eclipse on Windows

    - by BSchlinker
    Hey guys, I'm having trouble building an ELF file within Eclipse on Windows. It seems that every time I build, a PE (portable executable) for Windows is created. I've gone into the Binary Parser section and checked Elf Parser while making sure that everything else is unchecked. However, I continue to end up with a PE which I cannot run on Linux. For clarification, I'm using the Linux GCC toolchain within Eclipse. I've attempted a reinstall of Cygwin -- still experiencing the same issues. Any ideas? Thanks
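
    For what it's worth, the Binary Parser setting only controls how Eclipse reads the binaries it finds, not what the compiler emits, and Cygwin's native gcc always targets Windows; producing ELF output needs a Linux-targeting cross-compiler. A quick way to confirm what the toolchain actually produced (assuming Cygwin's file utility is installed; the output line is illustrative):

        $ file Debug/myproject.exe
        Debug/myproject.exe: PE32 executable (console) Intel 80386, for MS Windows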

  • How to get the git commit count?

    - by Splo
    I'd like to get the number of commits of my git repository, a bit like SVN revision numbers. The goal is to use it as a unique, incrementing build number. I currently do it like this, on Unix/Cygwin/msysGit:

        git log --pretty=format:'' | wc -l

    But I feel it's a bit of a hack. Is there a better way to do that? It would be cool if I actually didn't need wc or even git, so it could work on a bare Windows. Just read a file or a directory structure ...
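
    For what it's worth, newer versions of git can do the counting themselves, which at least drops the dependency on wc (a sketch; requires a git recent enough to support --count):

        git rev-list --count HEAD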

  • SVN Externals in a different SCM

    - by Sean Chambers
    At a previous workplace we used svn externals to update dependent projects when a shared component was updated. This made it easy to see if those changes broke anything, as well as to update dependent projects to the latest version of a shared component automatically, without any intervention. At a new workplace we are using cc.net with Surround SCM, and I'm trying to find something similar in Surround. I haven't found anything like externals, only "shared files", but unlike externals, shared files don't allow you to point at a specific revision of a file. I'm interested in what other people are doing in these scenarios to lean on their continuous integration server and treat it as a true integration server rather than just a "continuous build" server. Does anyone know of a tool or a way to get "externals" behavior without using svn? I suppose I could keep an XML registry file of which projects depend on which assemblies and whether they should be using the latest version, but this seems like overkill.

  • Fixing lots of broken references in a working ASP.NET MVC project

    - by davidbuttrick
    The last time I worked on this project everything was fine. That was about 4 days ago. Now, when I open the project, all the references to .NET are not working and I cannot build my project any more. I have tried following the advice in posts here, but to no avail. Even simple things break, like Request.cookies - Request gets a squiggly underline, and I get 'Request is undefined' when I hover over it. That doesn't seem like something I can fix by just removing and recreating the reference to System.Web.Mvc, which I have tried, with no luck. Any ideas? Surely there are other issues that can cause this problem... Thank you.

  • Multi-process builds in Visual Studio 2010: Worth it?

    - by coryr
    I've started testing our C++ software with VS2010 and the build times are really bad (30-45 minutes, about double the VS2005 times). I've been reading about the /MP switch for multi-process compilation. Unfortunately, it is incompatible with some features that we use quite a bit, like #import, incremental compilation, and precompiled headers. Have you had a similar project where you tried the /MP switch after turning off things like precompiled headers? Did you get faster builds? I'm running 64-bit Windows 7 on a 4-core machine with 4 GB of RAM and fast SSD storage. Virus scanner disabled and a pretty minimal software environment.
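
    For reference, a sketch of what enabling it looks like in a VS2010 .vcxproj (as documented, /MP conflicts with /Gm, so incremental compilation has to be switched off):

        <!-- inside the <ItemDefinitionGroup> for the configuration -->
        <ClCompile>
          <MultiProcessorCompilation>true</MultiProcessorCompilation>  <!-- /MP -->
          <MinimalRebuild>false</MinimalRebuild>                       <!-- no /Gm -->
        </ClCompile>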

  • Building a multi-platform SWT application using Ant

    - by Mridang Agarwalla
    I'm writing an SWT application which can be used on Windows (32/64 bit) and Mac OS X (32/64 bit). Apart from the JRE, I rely on the SWT library found here. I can find four versions of the SWT library depending upon my target platforms (as mentioned above). When building my application, how can I compile using the correct SWT jar? If possible, I'd like to avoid hard-coding the jar version, platform and architecture. The SWT jars are named like this:

        swt-win32-x86_64.jar
        swt-win32-x86_32.jar
        swt-macosx-x86_32.jar
        swt-macosx-x86_64.jar

    (My project will be an open source project. I'd like people to be able to download the source and build it, and therefore I've thought of including all four versions of the SWT jars in the source distribution. I hope this is the correct approach for publishing code relying on third-party libraries.) Thanks everyone.
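
    A sketch of one way to derive the jar name at build time with Ant <condition> elements (the os arch values reported by the JVM vary between platforms, so the 64-bit mapping below is an assumption to verify):

        <condition property="swt.ws" value="win32"><os family="windows"/></condition>
        <condition property="swt.ws" value="macosx"><os family="mac"/></condition>
        <condition property="swt.arch" value="x86_64">
          <or><os arch="amd64"/><os arch="x86_64"/></or>
        </condition>
        <property name="swt.arch" value="x86_32"/>  <!-- fallback; <property> is a no-op if already set -->
        <property name="swt.jar" value="lib/swt-${swt.ws}-${swt.arch}.jar"/>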

  • Ant Tokenizer: Selecting an individual Token

    - by John Oxley
    I have the following Ant task:

        <loadfile property="proj.version" srcfile="build.py">
          <filterchain>
            <striplinecomments>
              <comment value="#"/>
            </striplinecomments>
            <linecontains>
              <contains value="Version"/>
            </linecontains>
          </filterchain>
        </loadfile>
        <echo message="${proj.version}"/>

    And the output is:

        [echo] config ["Version"] = "v1.0.10-r4.2"

    How do I then use a tokenizer to get only v1.0.10-r4.2, the equivalent of | cut -d'"' -f4
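
    One way that stays inside the same filterchain (a sketch): a <tokenfilter> with a <replaceregex> can throw away everything except the fourth quote-delimited field:

        <tokenfilter>
          <!-- keep only the text inside the second pair of quotes: cut -d'"' -f4 -->
          <replaceregex pattern='^[^"]*"[^"]*"[^"]*"([^"]*)".*$' replace="\1"/>
        </tokenfilter>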

  • Qt compilation and stylesheet

    - by Yosko
    Each time I compile my Qt project after modifying my qss stylesheet file, the modifications aren't taken into account unless I rebuild everything. Any idea on a workaround for this, so that I don't have to wait 5 minutes each time I change my qss? Notes: I use Qt 4.8, and my stylesheet is declared in a resource file (qrc). EDIT: As suggested by Luca Carlon, when a qss is referenced in the project through a .qrc file, changes to the qss don't affect the qrc, and the compiler ignores it. To avoid that, I added a Custom Build Step to my project which:

    - runs before the qmake step,
    - calls a .bat file without any arguments,
    - where the .bat contains the real command: copy /b files.qrc +,,

  • WPF application in obj directory doesn't work.

    - by juharr
    When I build my WPF application, the exe that ends up in the bin directory works just fine, but the one in the obj directory does not. When I debug the exe from the obj directory I get the following exception: TypeInitializationException was unhandled: The type initializer for 'MyProject.App' threw an exception. So basically I'm wondering why the obj exe doesn't work while the bin one does (I was under the assumption that the obj exe was just copied to bin), and how to fix it. The reason I even care is that I'm using WiX to create an MSI for my application, and I have a Votive project setup that uses var.MyProject.TargetPath, which points to the exe in the obj directory.

  • Visual Studio namespace errors after deleting userControls

    - by msfanboy
    Really, Visual Studio can be so annoying sometimes... I did nothing more than delete 3 UserControls in a folder. Since then I get an error message that I cannot get rid of. Whatever I do, I cannot successfully build my project. I did not touch the SchoolAdministrationUC.xaml file, but I did delete 3 other UserControls also located in the path TBM\View\SchoolclassAdministration\. Error message from VS (translated): Error 1: The type or namespace name "SchoolclassAdministration" does not exist in the namespace "TBM.View" (are you missing an assembly reference?) E:\TBM\obj\x86\Debug\View\SchoolclassAdministration\SchoolAdministrationUC.g.cs 33 16 TBM. How do I get rid of this error?

  • How to program three editions Light, Pro, Ultimate in one solution

    - by Henry99
    I'd like to know how best to program three different editions of my C# ASP.NET 3.5 application in VS2008 Professional (which includes a web deployment project). I have a Light, Pro and Ultimate edition (or version) of my application. At the moment I've put everything in one solution with three build configurations in Configuration Manager, and I use preprocessor directives all over the code (there are around 20 such constructs in some ten thousand lines of code, so it's manageable):

        #if light
        //light code
        #endif
        #if pro
        //pro code
        #endif
        //etc...

    I've read stackoverflow for hours and hoped to find out how, e.g., Microsoft does this with its different Windows editions, but did not find what I expected. Somewhere there is a heated discussion about whether preprocessor directives are evil. What I like about those #if directives is the side-by-side code of the differences, so I will still understand the code for the different editions after six months, and the added benefit of NOT shipping compiled code of the other editions to the customer. OK, long explanation; to repeat the question: what's the best way to go?
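
    For what it's worth, the three build configurations can define the symbols themselves, so the edition-to-symbol mapping lives in the .csproj rather than in people's heads (a sketch; the configuration names are assumed to match those in Configuration Manager):

        <PropertyGroup Condition=" '$(Configuration)' == 'Light' ">
          <DefineConstants>TRACE;light</DefineConstants>
        </PropertyGroup>
        <PropertyGroup Condition=" '$(Configuration)' == 'Pro' ">
          <DefineConstants>TRACE;pro</DefineConstants>
        </PropertyGroup>
        <PropertyGroup Condition=" '$(Configuration)' == 'Ultimate' ">
          <DefineConstants>TRACE;ultimate</DefineConstants>
        </PropertyGroup>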

  • Frequent updates of a Tomcat application

    - by Erel Segal Halevi
    I have an application that runs on a Tomcat 7 server on a Windows machine. In its current stage, I have to frequently update and fix it. Whenever I need to update the application, I do all this:

    1. Build a new war file.
    2. Go to the Windows server and stop the Tomcat service.
    3. Download the file and put it under webapps.
    4. Remove the old application folder under webapps.
    5. Remove the old application folder under work/Catalina/localhost (otherwise it keeps the old version cached).
    6. Restart the Tomcat service.

    I am sure there is a way to do all this automatically. What is it?
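
    A sketch of one way to automate it, using the Tomcat 7 manager application's text interface; with update=true the old deployment is undeployed for you, so the manual cleanup steps go away. Host, credentials and paths below are placeholders, and the user needs the manager-script role in tomcat-users.xml:

        curl -u deployer:secret -T myapp.war "http://localhost:8080/manager/text/deploy?path=/myapp&update=true"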

  • JUnit Ant task, output stack trace

    - by Benju
    I have a number of tests failing in the following JUnit task:

        <target name="test-main" depends="build.modules" description="Main Integration/Unit tests">
          <junit fork="yes" description="Main Integration/Unit Tests"
                 showoutput="true" printsummary="true" outputtoformatters="true">
            <classpath refid="test-main.runtime.classpath"/>
            <batchtest filtertrace="false" todir="${basedir}">
              <fileset dir="${basedir}" includes="**/*Test.class"
                       excludes="**/*MapSimulationTest.class"/>
            </batchtest>
          </junit>
        </target>

    How do I tell JUnit to output the errors for each test so that I can look at the stack traces and debug the issues?
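
    A nested <formatter> should do it (a sketch): the plain formatter writes each test's failures with their stack traces, and usefile="false" sends them to the console instead of to files:

        <!-- add inside the <junit> element -->
        <formatter type="plain" usefile="false"/>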

  • Building SL4 + RIAServices app takes too long on VS2010.

    - by adlanelm
    Got a Win7 box with VS2010 Premium installed on it. Building desktop apps works just fine. But we've got this solution with 15 SL4 and 21 desktop projects... Building the SL part of it takes too long. This is very irritating and encourages dropping TDD, since every time I run a test it takes ~3 seconds for msbuild to find out that nothing changed and the project should be skipped. The projects are very small, there's nothing fancy in them, and we didn't have any problems before we switched from VS2008+SL3. I've heard people complaining about VS2010 speed in general, but nothing about SL4 build times. Is anyone experiencing the same problems, and is there any workaround for this?

  • Data-only static libraries with GCC

    - by regularfry
    How can I make static libraries with only binary data, that is, without any object code, and make that data available to a C program? Here's the build process and simplified code I'm trying to make work:

    ./datafile:

        abcdefghij

    Makefile:

        libdatafile.a:
        	ar [magic] datafile

        main: libdatafile.a
        	gcc main.c libdatafile.a -o main

    main.c:

        #include <stdio.h>
        #include <string.h>

        #define TEXTPTR [more magic]

        int main(){
            char mystring[11];
            memset(mystring, '\0', 11);
            memcpy(mystring, TEXTPTR, 10);
            puts(mystring);
            puts(mystring);
            return 0;
        }

    The output I'm expecting from running main is, of course: abcdefghijabcdefghij. My question is: what should [magic] and [more magic] be?
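
    One possible answer for a GNU toolchain (a sketch; ld derives the symbol names from the input file name, so verify them with nm): let ld wrap the raw file in an object, then archive it.

        libdatafile.a:
        	ld -r -b binary -o datafile.o datafile
        	ar rcs libdatafile.a datafile.o

    Then in main.c, [more magic] becomes a reference to the generated start symbol:

        extern char _binary_datafile_start[];   /* generated by ld from "datafile" */
        #define TEXTPTR _binary_datafile_start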

  • Different versions in manifest on different machines

    - by Terry777
    Hi guys, Have two machines, both with VS2005 SP1 installed and with WinSxS showing the same things installed. When one machine builds a particular C++ .dll .vcproj it ends up with

        <assemblyIdentity type='win32' name='Microsoft.VC80.MFC' version='8.0.50727.762'
                          processorArchitecture='x86' publicKeyToken='1fc8b3b9a1e18e3b' />

    in its manifest file. But on the other machine it ends up with

        <assemblyIdentity type='win32' name='Microsoft.VC80.MFC' version='8.0.50608.0'
                          processorArchitecture='x86' publicKeyToken='1fc8b3b9a1e18e3b' />

    even though this machine does not have '8.0.50608.0' libraries listed in its WinSxS. The .dll built on the machine with the older version referenced has some problems. I have ensured both machines have the same latest source code and references etc. What could be causing it to build with the different reference? Thanks! Terry
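
    A possible culprit (hedged): with VS2005 SP1, the version that ends up in the manifest is steered by the _BIND_TO_CURRENT_* preprocessor macros; without them, the headers still emit a reference to the RTM (8.0.50608.0) assemblies, so two machines with different defaults can produce different manifests from identical source. A sketch of pinning the SP1 binding:

        // stdafx.h, before any library headers (or pass as /D definitions):
        #define _BIND_TO_CURRENT_CRT_VERSION 1
        #define _BIND_TO_CURRENT_MFC_VERSION 1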

  • I have an Xcode static library project, how do I add a test target to it so I can run it there? (Ins

    - by zekel
    I want to be able to test library code in the library target so I don't have to switch over to a separate project to run it. I see how to add a target, but I'm not sure how to set it up to run like the "Command Line Tool" project template does. I tried adding a new "Shell Tool" target, but I don't know how to make it run like one. What build settings do I have to add to that target? What files (main.m?) do I need to start it up?

  • Common files in output directories in a C# program

    - by Net Citizen
    My VS2008 solution has the following setup:

    - Program1
    - Program2
    - Common.dll (used and referenced by both Program1 and Program2)

    In debug mode I like to set my output directory to Program Files\Productname, because some code will get the exe path for various reasons. My problem is that Program1, when compiled, will give an error that it could not copy Common.dll if Program2 is running. And vice versa. The annoyance here is that I don't even make changes to Common.dll that often, but 100% of the time it will try to copy it, not only when there are changes. I end up having to close all programs, then build, and then start them. So my question is: how can I have VS2008 copy Common.dll only if there are changes inside the Common.dll project?

  • Using different string files in Android

    - by boreas
    I'm porting my iPhone app to Android and I'm having a problem with the string files now. The app is a translation tool and users can switch the languages, so all the localized strings exist in both languages and are independent of the locale the OS is running. For the iOS version I have different files like de.strings, en.strings and fr.strings, and so on. For every target with a specified language pair I read the strings from the string tables, e.g. for de-fr I include de.strings and fr.strings in the project, set the names of the string tables in the info-list file, and read strings from them. In the end I have one project containing different targets (with different info-list files) and all are well configured. I'm intending to do the same on the Android platform, but: Is only one strings.xml allowed per project? How do I set up different build targets? How do I specify per target which strings.xml it should read?

  • Continuous Integration for SQL Server Part II – Integration Testing

    - by Ben Rees
    My previous post, on setting up Continuous Integration for SQL Server databases using GitHub, Bamboo and Red Gate's tools, covered the first two parts of a simple Database Continuous Delivery process:

    - Putting your database in to a source control system, and,
    - Running a continuous integration process, each time changes are checked in.

    However there is, of course, a lot more to Continuous Delivery than that. Specifically, in addition to the above:

    - Putting some actual integration tests in to the CI process (otherwise, they don't really do much, do they!?),
    - Deploying the database changes with a managed, automated approach,
    - Monitoring what you've just put live, to make sure you haven't broken anything.

    This post will detail how to set up a very simple pipeline for implementing the first of these (continuous integration testing). NB: A lot of the setup in this post is built on top of the configuration from before, so it might be difficult to implement this post without running through part I first. There'll then be a third post on automated database deployment, followed by a final post dealing with the last item - monitoring changes on the live system.

    In the previous post, I used a mixture of Red Gate products and other 3rd party software - GitHub and Atlassian Bamboo specifically. This was partly because I believe most people work in a heterogeneous environment, using software from different vendors to suit their purposes, and I wanted to show how this could work for this process. For example, you could easily substitute Atlassian's BitBucket or Stash for GitHub, depending on your needs, or use an alternative CI server such as TeamCity, TFS or Jenkins. However, in this post, I'll be mostly using Red Gate products only (other than tSQLt). I would do this firstly because I work for Red Gate. However, I also think that in the area of Database Delivery processes, nobody else has the offerings to implement this process fully - so I didn't have any choice!

    Background on Continuous Delivery

    For me, a great source of information on what makes a proper Continuous Delivery process is the Jez Humble and David Farley classic: Continuous Delivery - Reliable Software Releases through Build, Test, and Deployment Automation. This book is not, of course, primarily about databases, and the process I outline here and in the previous article is a gross simplification of what Jez and David describe (not least because it's that much harder for databases!). However, a lot of the principles that they describe can be equally applied to database development and, I would argue, should be. As I say, however, what I describe here is a very simple version of what would be required for a full production process. A couple of useful resources on handling some of these complexities can be found in the following two references:

    - Refactoring Databases - Evolutionary Database Design, by Scott J Ambler and Pramod J. Sadalage
    - Versioning Databases - Branching and Merging, by Scott Allen

    In particular, I don't deal at all with the issues of multiple branches and merging of those branches, an issue made particularly acute by the use of GitHub. The other point worth making is that, in the words of Martin Fowler: "Continuous Delivery is about keeping your application in a state where it is always able to deploy into production." I.e. we are not talking about continuously delivering updates to the production database every time someone checks in an amendment to a stored procedure.
    That is possible (and what Martin calls Continuous Deployment). However, again, that's more than I describe in this article. And I doubt I need to remind DBAs or Developers to Proceed with Caution!

    Integration Testing

    Back to something practical. The next stage, building on our setup from the previous article, is to add some integration tests to the process. As I say, the CI process, though interesting, isn't enormously useful without some sort of test process running. For this we'll use the tSQLt framework, an open source framework designed specifically for running SQL Server tests. tSQLt is part of Red Gate's SQL Test, found on http://www.red-gate.com/products/sql-development/sql-test/, or can be downloaded separately from www.tsqlt.org - though I'll provide a step-by-step guide below for setting this up.

    Getting tSQLt set up via SQL Test

    1. Click on the link http://www.red-gate.com/products/sql-development/sql-test/ and click on the blue Download button to download the Red Gate SQL Test product, if not already installed.
    2. Follow the install process for SQL Test to install the SQL Server Management Studio (SSMS) plugin on to your machine, if not already installed.
    3. Open SSMS. You should now see SQL Test under the Tools menu. Clicking this link will give you the basic SQL Test dialogue.
    4. As yet, though we've installed the SQL Test product, we haven't installed the tSQLt test framework on any particular database. To do this, we need to add our RedGateApp database using this dialogue, by clicking on the + Add Database to SQL Test… link, selecting the RedGateApp database and clicking the Add Database link.
    5. In the next screen, SQL Test describes what will be installed on the database for the tSQLt framework. Also in this dialogue, uncheck the "Add SQL Cop tests" option. SQL Cop is a great set of pre-defined tests that work within the tSQLt framework to check the general health of your SQL Server database. However, we won't be using them in this particular simple example.
    6. Once you've clicked on the OK button, the changes described in the dialogue will be made to your database.

    We've now installed the framework. However, we haven't actually created any tests, so this will be the next step. But before we proceed, we've made an update to our database, so we should again check this in to source control, adding comments as required. It's also worth a quick check that your build still runs with the new additions! (And a quick check of the RedGateAppCI database shows that the changes have been made.)

    Creating and Testing a Unit Test

    There are, of course, a lot of very interesting unit tests that you could and should set up for a database. The great thing about the tSQLt framework is that you can write these in SQL. The example I'm going to use here is pretty Mickey Mouse - our database table is going to include some email addresses as reference data and I want to check whether these are all in a correct email format. Nothing clever, but it illustrates the process and hopefully shows the method by which more interesting tests could be set up.

    Adding Reference Data to our Database

    To start, I want to add some reference data to my database, and have this source controlled (as well as the schema).
    First of all I need to add some data to my solitary table - this can be done a number of ways, but I'll do it in SSMS for simplicity. Currently this reference data just exists in the database. For proper integration testing, it needs to form part of the source-controlled version of the database - and so needs to be added to the Git repository. This can be done via SQL Source Control, though first a Primary Key needs to be added to the table: right-click the table, select Design, right-click on the first "id" row, then click on "Set Primary Key". NB: once this change is made, click Save to save the change to the table. Then, to source control this reference data, right-click on the table (dbo.Email) and select the option to link the table's data. In the next screen, link the data in the Email table by selecting it from the list and clicking "save and close".

    We should at this point re-commit the changes (both the addition of the Primary Key, and the data) to the Git repo. NB: From here on, I won't show screenshots for the GitHub side of things - it's the same each time: whenever a change is made in SQL Source Control and committed to your local folder, you then need to sync this in the GitHub Windows client (as this is where the build server, Bamboo, is taking it from). An interesting point to note here: when these changes are committed in SQL Source Control (right-click database and select "Commit Changes to Source Control.."), the display gives a warning about possibly needing a migration script for the "Add Primary Key" step of the changes. This isn't actually necessary in this case, but this mechanism would allow you to create override scripts to replace the default change scripts created by the SQL Compare engine (which runs underneath SQL Source Control). Ignoring this message (!), we add a comment and commit the changes to Git. I then sync these, run a build (or the build gets run automatically), and check that the data is being deployed over to the target RedGateAppCI database.

    Creating and Running the Test

    As I mention, the test I'm going to use here is a very simple one - are the email addresses in my reference table valid? This isn't, of course, a full test of email validation (I expect the email addresses I've chosen here aren't really those of the Fab Four) - but just a very basic check of the format used. I've taken the relevant SQL from this Stack Overflow article. In SSMS select "SQL Test" from the Tools menu, then click on + New Test. In the next screen, give your new test a name, and also enter a name in the Test Class box (test classes are schemas that help you keep things organised). Also check that the database in which the test is going to be created is correct - RedGateApp in this example. Click "Create Test". After closing a couple of subsequent dialogues, you'll see a dummy script for the test that needs filling in. We now need to define the SQL for our test. As mentioned before, tSQLt allows you to write your unit tests in T-SQL, and the code I'm going to use here is as below.
    This needs to be copied and pasted into the query window, to replace the default given by tSQLt:

        -- Basic email check test
        ALTER PROCEDURE [MyChecks].[test Check Email Addresses]
        AS
        BEGIN
            SET NOCOUNT ON

            DECLARE @Output VARCHAR(MAX)
            SET @Output = ''

            SELECT @Output = @Output + Email + CHAR(13) + CHAR(10)
            FROM dbo.Email
            WHERE Email NOT LIKE '%_@__%.__%'

            IF @Output > ''
            BEGIN
                SET @Output = CHAR(13) + CHAR(10) + @Output
                EXEC tSQLt.Fail @Output
            END
        END;

    Once this script is entered, hit execute to add the Stored Procedure to the database. Before committing the test to source control, it's worth just checking that it works! For a positive test, click on "SQL Test" from the Tools menu, then click Run Tests. You should see a green tick to indicate success! But of course, what we also need to do is test that this is actually doing something, by showing a failed test. Edit one of the email addresses in your table to an incorrect format, then re-run the same SQL Test as before. Great - we now know that our test is really doing something! You'll also see a useful error message at the bottom of SSMS. (Leave the email address as invalid for now, for the next steps.) The next stage is to check this new test in to source control again, by right-clicking on the database and checking in the changes with a commit message (and not forgetting to sync in the GitHub client).

    Checking that the Tests are Running as Integration Tests

    After the changes above are made, and after a build has run on Bamboo (manual or automatic), looking at the Stored Procedures for the RedGateAppCI database shows that the SPROC for the new test has been moved over. However, this is not exactly what we were after. We didn't want to just copy objects from one database to another, but actually run the tests as part of the build/integration test process. I.e. we're continuously checking any changes we make (in this case, to the reference data emails), to ensure we're not breaking a test that we've set up. The behaviour we want to see is that, if we check in static data that is incorrect (as we did above) and we have the tSQLt test set up, then our build in Bamboo should fail. However, re-running the build shows, sadly, a successful build! To make sure the tSQLt tests are run as part of the integration test, we need to amend a switch in the Red Gate CI config file. First, navigate to the file sqlCI.targets in your working folder. Edit this document, make the following change, save the document, then commit and sync this change in the GitHub client:

        <!-- tSQLt tests -->
        <!-- Optional -->
        <!-- To run tSQLt tests in source control for the database, enter true. -->
        <enableTsqlt>true</enableTsqlt>

    Now, if we re-run the build in Bamboo (NB: I've moved to a new server here, hence the different address and build number) - superb, a broken build!! The error message isn't great here, so to get more detailed info, click on the full build log link on the page (below the fold). The interesting part of the log is towards the bottom. Pulling out this part:

        21-Jun-2013 11:35:19  Build FAILED.
        21-Jun-2013 11:35:19
        21-Jun-2013 11:35:19  "C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj" (default target) (1) ->
        21-Jun-2013 11:35:19  (sqlCI target) ->
        21-Jun-2013 11:35:19  EXEC : sqlCI error occurred: RedGate.Deploy.SqlServerDbPackage.Shared.Exceptions.InvalidSqlException: Test Case Summary: 1 test case(s) executed, 0 succeeded, 1 failed, 0 errored. [C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj]
        21-Jun-2013 11:35:19  EXEC : sqlCI error occurred: [MyChecks].[test Check Email Addresses] failed: [C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj]
        21-Jun-2013 11:35:19  EXEC : sqlCI error occurred: ringo.starr@beatles [C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj]
        21-Jun-2013 11:35:19  EXEC : sqlCI error occurred: [C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj]
        21-Jun-2013 11:35:19  EXEC : sqlCI error occurred: +----------------------+ [C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj]
        21-Jun-2013 11:35:19  EXEC : sqlCI error occurred: |Test Execution Summary| [C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj]

    As a final check, we should make sure that, if we now fix this error, the build succeeds. So in SSMS, I'm going to correct the invalid email address, then check this change in to SQL Source Control (with a comment), commit to GitHub, and re-run the build. This should have fixed the build: It worked!

    Summary

    This has been a very quick run through the implementation of CI for databases, including tSQLt tests to test whether your database updates are working. The next post in this series will focus on automated deployment - we've tested our database changes; how can we now deploy these to target sites?

  • libgtk2.0-common fails to build with Gdk-2.0.gir error, Type reference 'GdkPixbuf' not found

    - by Stefano Palazzo
    I'm trying to build gtk, but it fails. Here's what I'm doing:

        sudo apt-get build-dep libgtk2.0-common
        sudo apt-get source libgtk2.0-common
        cd gtk+2.0-2.22.0/
        sudo gedit gtk/gtktreeview.c &
        # ...editing a few files (or not, it's the same error)
        sudo ./configure --prefix=/usr
        sudo make

    The compilation runs for a while and then quits:

        Gdk-2.0.gir: error: Type reference 'GdkPixbuf' not found
        ...
        make: *** [all] Error 2

    What am I doing wrong?
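
    A guess at a workaround: the failure happens in the gobject-introspection scanning step, so if the introspection data isn't needed for the rebuilt gtk, telling configure to skip it may let the build finish (flag assumed to be supported by this gtk+ version's configure script):

        sudo ./configure --prefix=/usr --disable-introspection
        sudo make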

  • "'/usr/share/app-install/desktop/software-center.menu': Not a directory" error when trying to reinstall the software center

    - by EnTer
    I was having some problems with the Software Center, so I tried to reinstall it. But removing it gives me an error:

        (Reading database ... 150986 files and directories currently installed.)
        Removing software-center ...
        dpkg: error processing software-center (--remove):
         unable to securely remove '/usr/share/app-install/desktop/software-center.menu': Not a directory
        Processing triggers for man-db ...
        Processing triggers for hicolor-icon-theme ...
        Processing triggers for desktop-file-utils ...
        Processing triggers for gnome-menus ...
        Processing triggers for bamfdaemon ...
        Rebuilding /usr/share/applications/bamf.index...
        Errors were encountered while processing:
         software-center
        E: Sub-process /usr/bin/dpkg returned an error code (1)

    Now the same problem repeats whenever I use apt-get to install or upgrade anything on my Ubuntu. I can't install any software, including the Software Center itself. Please help.
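
    One possible way out (hedged; check first with ls -ld what is actually sitting at that path): if software-center.menu is not the plain file dpkg expects, removing it by hand and letting apt retry may unblock things:

        ls -ld /usr/share/app-install/desktop/software-center.menu
        sudo rm -rf /usr/share/app-install/desktop/software-center.menu
        sudo apt-get -f install
        sudo apt-get install --reinstall software-center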

  • Installing a directory with a Debian Package

    - by Meisie
    Hi guys, I want to create a Debian package that installs a bunch of folders to a system, but I can't get it working. The package gets created without any errors and lintian also says it's okay, but installing does nothing. The rules file looks like this:

        #!/usr/bin/make -f

        logs = $(CURDIR)/shell_logs/
        DEST1 = /opt/Pacetutor/

        build: build-stamp

        build-stamp:
        	dh_testdir
        	touch build-stamp

        clean:
        	dh_testdir
        	dh_testroot
        	rm -f build-stamp
        	dh_clean

        install: build clean $(logs)
        	dh_testdir
        	dh_testroot
        	dh_prep
        	dh_installdirs
        	mkdir -m 755 -p $(DEST1)    # this is probably optional or not needed
        	cp -r $(logs) $(DEST1)      # using mv works, but that's not what I want

        binary-indep: build install
        	dh_testdir
        	dh_testroot
        	dh_installchangelogs
        	dh_installdocs
        	dh_installexamples
        	dh_installman
        	dh_link
        	dh_compress
        	dh_fixperms
        	dh_installdeb
        	dh_gencontrol
        	dh_md5sums
        	dh_builddeb

        binary-arch: build install

        binary: binary-indep binary-arch

        .PHONY: build clean binary-indep binary-arch binary install
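
    For what it's worth, a common cause of a package that builds cleanly but installs nothing is copying the payload to the live filesystem instead of the staging area under debian/ - dh_builddeb only packs up what sits below debian/<package>/. A sketch of the corrected variable and install target, assuming the binary package is named pacetutor:

        DEST1 = $(CURDIR)/debian/pacetutor/opt/Pacetutor/

        install: build
        	dh_testdir
        	dh_testroot
        	dh_prep
        	dh_installdirs
        	mkdir -m 755 -p $(DEST1)
        	cp -r $(logs) $(DEST1)

    (Note the original install target also depended on clean, which deletes build-stamp right after build creates it.)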
