Search Results


  • How do you build a Windows Workflow Project with NAnt 0.90?

    - by LockeCJ
    I'm trying to build a Windows Workflow (WF) project using NAnt, but it doesn't seem to be able to build the ".xoml" and ".rules" files. Here is the csc task that I'm using:

        <csc debug="${build.Debug}"
             warninglevel="${build.WarningLevel}"
             target="library"
             output="${path::combine(build.OutputDir,assembly.Name+'.dll')}"
             verbose="${build.Verbose}"
             doc="${path::combine(build.OutputDir,assembly.Name+'.xml')}">
          <sources basedir="${assembly.BaseDir}">
            <include name="**/*.cs" />
            <include name="**/*.xoml" />
            <include name="**/*.rules" />
          </sources>
          <resources basedir="${assembly.BaseDir}">
            <include name="**/*.xsd" />
            <include name="**/*.resx" />
          </resources>
          <references> ... </references>
        </csc>

    Here's the output:

        Compiling 21 files to 'c:\Output\MyWorkFlowProject.dll'.
        [csc] c:\Projects\MyWorkFlowProject\AProcessFlow.xoml(1,1): error CS0116: A namespace does not directly contain members such as fields or methods
        [csc] c:\Projects\MyWorkFlowProject\BProcessFlow.xoml(1,1): error CS0116: A namespace does not directly contain members such as fields or methods
        [csc] c:\Projects\MyWorkFlowProject\CProcessFlow.rules(1,1): error CS0116: A namespace does not directly contain members such as fields or methods
        [csc] c:\Projects\MyWorkFlowProject\CProcessFlow.xoml(1,1): error CS0116: A namespace does not directly contain members such as fields or methods
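    Worth noting as a hedged aside: csc cannot compile .xoml or .rules files at all; in a Visual Studio build they are handled by the Workflow-specific MSBuild targets that the .csproj imports. A minimal sketch of a workaround, assuming the workflow project keeps its regular project file (the project file name and configuration below are illustrative), is to have NAnt hand that project to MSBuild instead of using the csc task:

        <!-- Delegate the WF project to MSBuild, which knows about the
             workflow targets; msbuild.exe lives in the framework directory. -->
        <exec program="msbuild.exe"
              basedir="${framework::get-framework-directory(framework::get-target-framework())}">
          <arg file="MyWorkFlowProject.csproj" />
          <arg value="/p:Configuration=Release" />
        </exec>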

  • Accessing TFS from Powershell

    - by w4ymo
    Hello. I am new to PowerShell and I am trying to get branches from TFS and merge them using a PowerShell script. Unfortunately I am failing at the first hurdle. I have Visual Studio 2010 installed on my local machine and can access the TFS server (also 2010) fine. I am running the script from my local machine and have the following lines:

        $tfs = get-tfs http://TFSServerName:8080/TFSProject
        $branchfolders = $tfs.VCS.GetItems('$/Dev/Branches/', $tfs.RecursionType::OneLevel)

    and I receive the following error on the second line:

        Exception calling "GetItems" with "2" argument(s): "Unable to connect to the remote server"

    I have configured the TFS server to accept incoming connections on port 8080, which works, but I am now not sure how to resolve this error. Is further configuration required? Thanks for any help given.
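    For comparison, here is a minimal sketch that talks to the TFS 2010 client object model directly; get-tfs is a helper function circulated in blog snippets, not a built-in cmdlet. The collection URL is an assumption (2010 defaults look like http://server:8080/tfs/CollectionName), and note that PowerShell addresses enum values through the type, as [TypeName]::Value, not through an object property:

        # Load the TFS 2010 client assemblies (installed with VS 2010 / Team Explorer)
        [Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client") | Out-Null
        [Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.VersionControl.Client") | Out-Null

        # Hypothetical collection URL; adjust to your server's layout
        $collection = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection([Uri]"http://TFSServerName:8080/tfs/DefaultCollection")
        $vcs = $collection.GetService([Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer])

        # RecursionType is referenced via its type name
        $branchfolders = $vcs.GetItems('$/Dev/Branches/', [Microsoft.TeamFoundation.VersionControl.Client.RecursionType]::OneLevel)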

  • Why would an autoconf/automake project link against the installed library instead of the local development library?

    - by Beau Simensen
    I'm creating a library libgdata that has some tests and non-installed programs. I am running into the problem that once I've installed the library, the programs seem to link to the installed version and no longer to the local version in ../src/libgdata.la. What could cause this? Am I doing something horribly wrong? Here is what my test/Makefile.am looks like:

        INCLUDES = -I$(top_srcdir)/src/ -I$(top_srcdir)/test/

        # libapiutil contains all of our dependencies!
        AM_CXXFLAGS = $(APIUTIL_CFLAGS)
        AM_LDFLAGS = $(APIUTIL_LIBS)
        LDADD = $(top_builddir)/src/libgdata.la

        noinst_PROGRAMS = gdatacalendar gdatayoutube
        gdatacalendar_SOURCES = gdatacalendar.cc
        gdatayoutube_SOURCES = gdatayoutube.cc

        TESTS = check_bare
        check_PROGRAMS = $(TESTS)
        check_bare_SOURCES = check_bare.cc

    (libapiutil is another library that has some helper stuff for dealing with libcurl and libxml++.) So, for instance, if I run the tests without having installed anything, everything works fine. I can make changes locally and they are picked up by these programs right away. If I install the package, these programs will compile (it seems like it does actually look locally for the headers), but once I run a program it complains about missing symbols. As far as I can tell, it is linking against the newly built library (../src/libgdata.la) based on the make output, so I'm not sure why this would be happening. If I remove the installed files, the local changes to src/* are picked up just fine. I've included the make output for gdatacalendar below:

        g++ -DHAVE_CONFIG_H -I. -I.. -I../src/ -I../test/ -I/home/altern8/workspaces/4355/dev-install/include -I/usr/include/libxml++-2.6 -I/usr/lib/libxml++-2.6/include -I/usr/include/libxml2 -I/usr/include/glibmm-2.4 -I/usr/lib/glibmm-2.4/include -I/usr/include/sigc++-2.0 -I/usr/lib/sigc++-2.0/include -I/usr/include/glib-2.0 -I/usr/lib/glib-2.0/include -g -O2 -MT gdatacalendar.o -MD -MP -MF .deps/gdatacalendar.Tpo -c -o gdatacalendar.o gdatacalendar.cc
        mv -f .deps/gdatacalendar.Tpo .deps/gdatacalendar.Po
        /bin/bash ../libtool --tag=CXX --mode=link g++ -I/home/altern8/workspaces/4355/dev-install/include -I/usr/include/libxml++-2.6 -I/usr/lib/libxml++-2.6/include -I/usr/include/libxml2 -I/usr/include/glibmm-2.4 -I/usr/lib/glibmm-2.4/include -I/usr/include/sigc++-2.0 -I/usr/lib/sigc++-2.0/include -I/usr/include/glib-2.0 -I/usr/lib/glib-2.0/include -g -O2 -L/home/altern8/workspaces/4355/dev-install/lib -lapiutil -lcurl -lgssapi_krb5 -lxml++-2.6 -lxml2 -lglibmm-2.4 -lgobject-2.0 -lsigc-2.0 -lglib-2.0 -o gdatacalendar gdatacalendar.o ../src/libgdata.la
        mkdir .libs
        g++ -I/home/altern8/workspaces/4355/dev-install/include -I/usr/include/libxml++-2.6 -I/usr/lib/libxml++-2.6/include -I/usr/include/libxml2 -I/usr/include/glibmm-2.4 -I/usr/lib/glibmm-2.4/include -I/usr/include/sigc++-2.0 -I/usr/lib/sigc++-2.0/include -I/usr/include/glib-2.0 -I/usr/lib/glib-2.0/include -g -O2 -o .libs/gdatacalendar gdatacalendar.o -L/home/altern8/workspaces/4355/dev-install/lib /home/altern8/workspaces/4355/dev-install/lib/libapiutil.so /usr/lib/libcurl.so -lgssapi_krb5 /usr/lib/libxml++-2.6.so /usr/lib/libxml2.so /usr/lib/libglibmm-2.4.so /usr/lib/libgobject-2.0.so /usr/lib/libsigc-2.0.so /usr/lib/libglib-2.0.so ../src/.libs/libgdata.so -Wl,--rpath -Wl,/home/altern8/workspaces/4355/dev-install/lib
        creating gdatacalendar

    Help. :)

    UPDATE: I get the following message when I try to run the calendar program after adding the addCommonRequestHeader() method to the Service class (the installed copy of the library predates that method):

        /home/altern8/workspaces/4355/libgdata/test/.libs/lt-gdatacalendar: symbol lookup error: /home/altern8/workspaces/4355/libgdata/test/.libs/lt-gdatacalendar: undefined symbol: _ZN55gdata7service7Service22addCommonRequestHeaderERKSsS4_

    Eugene's suggestion to try setting the $LD_LIBRARY_PATH variable did not help.

    UPDATE 2: I did two tests. First, I did this after blowing away my dev-install directory (--prefix); in that case, it creates test/.libs/lt-gdatacalendar. Once I have installed the library, though, it creates test/.libs/gdatacalendar instead. The output of ldd is the same for both with one exception:

        # before install
        # ldd test/.libs/lt-gdatacalendar
        libgdata.so.0 => /home/altern8/workspaces/4355/libgdata/src/.libs/libgdata.so.0 (0xb7c32000)

        # after install
        # ldd test/.libs/gdatacalendar
        libgdata.so.0 => /home/altern8/workspaces/4355/dev-install/lib/libgdata.so.0 (0xb7c87000)

    What would cause this to create lt-gdatacalendar in one case but gdatacalendar in another? The output of ldd on libgdata is:

        altern8@goldfrapp:~/workspaces/4355/libgdata$ ldd /home/altern8/workspaces/4355/libgdata/src/.libs/libgdata.so.0
        linux-gate.so.1 => (0xb7f7c000)
        libgcc_s.so.1 => /lib/libgcc_s.so.1 (0xb7f3b000)
        libc.so.6 => /lib/tls/i686/cmov/libc.so.6 (0xb7dec000)
        /lib/ld-linux.so.2 (0xb7f7d000)
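    A guess based on the link line above: $(APIUTIL_LIBS) is passed through AM_LDFLAGS, so its -L/home/.../dev-install/lib (and the rpath that goes with it) lands ahead of ../src/libgdata.la, and once a copy of libgdata is installed into that prefix the runtime linker resolves libgdata.so.0 there first. A minimal sketch of the usual automake idiom, putting external libraries in LDADD after the uninstalled libtool archive instead of in AM_LDFLAGS:

        INCLUDES = -I$(top_srcdir)/src/ -I$(top_srcdir)/test/
        AM_CXXFLAGS = $(APIUTIL_CFLAGS)
        # Order matters: the local .la first, then the external deps
        LDADD = $(top_builddir)/src/libgdata.la $(APIUTIL_LIBS)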

  • How do I get rid of "API restriction UnitTestFramework.dll already loaded" error?

    - by Kevin Driedger
    The following error pops up every now and then:

        C:\Program Files\MSBuild\Microsoft\VisualStudio\v9.0\TeamTest\Microsoft.TeamTest.targets(14,5): error : API restriction: The assembly 'file:///C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\PublicAssemblies\Microsoft.VisualStudio.QualityTools.UnitTestFramework.dll' has already loaded from a different location. It cannot be loaded from a new location within the same appdomain.

    How do I get rid of it?
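    One commonly suggested mitigation, offered as an assumption rather than a verified fix: the error tends to appear when a second copy of the UnitTestFramework assembly gets copied into a project's output folder and is loaded alongside the copy in PublicAssemblies. Marking the reference as not copied locally (Copy Local = false) in the affected test project's .csproj keeps a single copy in play:

        <Reference Include="Microsoft.VisualStudio.QualityTools.UnitTestFramework">
          <!-- Copy Local = false: don't duplicate the PublicAssemblies copy -->
          <Private>False</Private>
        </Reference>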

  • NoClassDefFoundError using Scala Plugin for Eclipse

    - by Jacob Lyles
    I successfully implemented and ran several Scala tutorials in Eclipse using the Scala plugin. Then suddenly I tried to compile and run an example, and this error came up:

        Exception in thread "main" java.lang.NoClassDefFoundError: hello/HelloWorld
        Caused by: java.lang.ClassNotFoundException: hello.HelloWorld
            at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
            at java.security.AccessController.doPrivileged(Native Method)
            at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:315)
            at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:330)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:250)
            at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:398)

    After this point I could no longer run any Scala programs in Eclipse. I tried cleaning and rebuilding my project, closing and reopening my project, and closing and reopening Eclipse. Eclipse version is 3.5.2 and the Scala plugin is 2.8.0. Here is the original code:

        package hello

        object HelloWorld {
          def main(args: Array[String]) {
            println("hello world")
          }
        }

  • CI Deployment Of Azure Web Roles Using TeamCity

    - by srkirkland
    After recently migrating an important new website to use Windows Azure "Web Roles," I wanted an easier way to deploy new versions to the Azure staging environment, as well as a reliable process to roll back deployments to a certain "known good" source control commit checkpoint. By configuring our JetBrains TeamCity CI server to use the Windows Azure PowerShell cmdlets to create new automated deployments, I'll show you how to take control of your Azure publish process.

    Step 0: Configuring your Azure Project in Visual Studio

    Before we can start looking at automating the deployment, we should make sure manual deployments from Visual Studio are working properly. Detailed information for setting up deployments can be found at http://msdn.microsoft.com/en-us/library/windowsazure/ff683672.aspx#PublishAzure or by doing some quick Googling, but the basics are as follows:

      1. Install the prerequisite Windows Azure SDK.
      2. Create an Azure project by right-clicking on your web project and choosing "Add Windows Azure Cloud Service Project" (or by manually adding that project type).
      3. Configure your Role and Service Configuration/Definition as desired.
      4. Right-click on your Azure project and choose "Publish," create a publish profile, and push to your web role.

    You don't actually have to do step 4 and create a publish profile, but it's a good exercise to make sure everything is working properly. Once your Windows Azure project is set up correctly, we are ready to move on to understanding the Azure publish process.

    Understanding the Azure Publish Process

    The actual Windows Azure project is fairly simple at its core: it builds your dependent roles (in our case, a web role) against a specific service and build configuration, and outputs two files:

      - ServiceConfiguration.Cloud.cscfg: the file containing your package configuration info, for example Instance Count, OsFamily, ConnectionString and other Setting information.
      - ProjectName.Azure.cspkg: the package file that contains the guts of your deployment, including all deployable files.

    When you package your Azure project, these two files will be created within the directory ./[ProjectName].Azure/bin/[ConfigName]/app.publish/. If you want to build your Azure project from the command line, it's as simple as calling MSBuild on the "Publish" target:

        msbuild.exe /target:Publish

    Windows Azure PowerShell Cmdlets

    The last pieces of the puzzle that make CI automation possible are the Azure PowerShell cmdlets (http://msdn.microsoft.com/en-us/library/windowsazure/jj156055.aspx). These cmdlets are what will let us create deployments without Visual Studio or other user intervention.

    Preparing TeamCity for Azure Deployments

    Now we are ready to get our TeamCity server set up so it can build and deploy Windows Azure projects, which we now know requires the Azure SDK and the Windows Azure PowerShell cmdlets. Installing the Azure SDK is easy enough: just go to https://www.windowsazure.com/en-us/develop/net/ and click "Install." Once the SDK is installed, I recommend running a test build to make sure your project is building correctly. You'll want to set up your build step using MSBuild with the "Publish" target against your solution file.

    Assuming the build was successful, you will now have the two *.cspkg and *.cscfg files within your build directory. If the build was red (failed), take a look at the build logs and keep an eye out for "unsupported project type" or other build errors, which will need to be addressed before the CI deployment can be completed.

    With a successful build we are now ready to install and configure the Windows Azure PowerShell cmdlets:

      1. Follow the instructions at http://msdn.microsoft.com/en-us/library/windowsazure/jj554332 to install the cmdlets and configure PowerShell.
      2. After installing the cmdlets, you'll need to get your Azure subscription info using the Get-AzurePublishSettingsFile command.
      3. Store the resulting *.publishsettings file somewhere you can get to easily, like C:\TeamCity, because you will need to reference it later from your deploy script.

    Scripting the CI Deploy Process

    Now that the cmdlets are installed on our TeamCity server, we are ready to script the actual deployment using a TeamCity "PowerShell" build runner. Before we look at any code, here's a breakdown of our deployment algorithm:

      1. Set up your variables, including the location of the *.cspkg and *.cscfg files produced in the earlier MSBuild step (remember, the folder is something like [ProjectName].Azure/bin/[ConfigName]/app.publish/).
      2. Import the Windows Azure PowerShell cmdlets.
      3. Import and set your Azure subscription information (this is basically your authentication/authorization step, so protect your settings file).
      4. Look for a current deployment; if you find one, upgrade it, else create a new deployment.

    Pretty simple and straightforward. Now let's look at the code (also available as a gist here: https://gist.github.com/3694398):

        $subscription = "[Your Subscription Name]"
        $service = "[Your Azure Service Name]"
        $slot = "staging" # staging or production
        $package = "[ProjectName]\bin\[BuildConfigName]\app.publish\[ProjectName].cspkg"
        $configuration = "[ProjectName]\bin\[BuildConfigName]\app.publish\ServiceConfiguration.Cloud.cscfg"
        $timeStampFormat = "g"
        $deploymentLabel = "ContinuousDeploy to $service v%build.number%"

        Write-Output "Running Azure Imports"
        Import-Module "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\Azure\*.psd1"
        Import-AzurePublishSettingsFile "C:\TeamCity\[PSFileName].publishsettings"
        Set-AzureSubscription -CurrentStorageAccount $service -SubscriptionName $subscription

        function Publish() {
            $deployment = Get-AzureDeployment -ServiceName $service -Slot $slot -ErrorVariable a -ErrorAction silentlycontinue

            if ($a[0] -ne $null) {
                Write-Output "$(Get-Date -f $timeStampFormat) - No deployment is detected. Creating a new deployment."
            }

            if ($deployment.Name -ne $null) {
                # Update deployment in place (usually faster, cheaper, won't destroy VIP)
                Write-Output "$(Get-Date -f $timeStampFormat) - Deployment exists in $service. Upgrading deployment."
                UpgradeDeployment
            } else {
                CreateNewDeployment
            }
        }

        function CreateNewDeployment() {
            write-progress -id 3 -activity "Creating New Deployment" -Status "In progress"
            Write-Output "$(Get-Date -f $timeStampFormat) - Creating New Deployment: In progress"

            $opstat = New-AzureDeployment -Slot $slot -Package $package -Configuration $configuration -label $deploymentLabel -ServiceName $service

            $completeDeployment = Get-AzureDeployment -ServiceName $service -Slot $slot
            $completeDeploymentID = $completeDeployment.deploymentid

            write-progress -id 3 -activity "Creating New Deployment" -completed -Status "Complete"
            Write-Output "$(Get-Date -f $timeStampFormat) - Creating New Deployment: Complete, Deployment ID: $completeDeploymentID"
        }

        function UpgradeDeployment() {
            write-progress -id 3 -activity "Upgrading Deployment" -Status "In progress"
            Write-Output "$(Get-Date -f $timeStampFormat) - Upgrading Deployment: In progress"

            # Perform the update deployment
            $setdeployment = Set-AzureDeployment -Upgrade -Slot $slot -Package $package -Configuration $configuration -label $deploymentLabel -ServiceName $service -Force

            $completeDeployment = Get-AzureDeployment -ServiceName $service -Slot $slot
            $completeDeploymentID = $completeDeployment.deploymentid

            write-progress -id 3 -activity "Upgrading Deployment" -completed -Status "Complete"
            Write-Output "$(Get-Date -f $timeStampFormat) - Upgrading Deployment: Complete, Deployment ID: $completeDeploymentID"
        }

        Write-Output "Create Azure Deployment"
        Publish

    Creating the TeamCity Build Step

    The only thing left is to create a second build step, after your MSBuild "Publish" step, with the build runner type "PowerShell." Set your script to "Source Code," set the script execution mode to "Put script into PowerShell stdin with '-Command' arguments," and then copy/paste in the above script (replacing the placeholder sections with your values).

    Wrap Up

    After combining the MSBuild /target:Publish step (which creates the necessary Windows Azure *.cspkg and *.cscfg files) and a PowerShell script step which utilizes the Azure PowerShell cmdlets, we have a fully deployable build configuration in TeamCity. You can configure this step to run whenever you'd like using build triggers; for example, you could even deploy whenever a new commit lands on the master branch and passes all required tests. In the script I've hardcoded that every deployment goes to the staging environment on Azure, but you could deploy straight to production if you want to, or even set up a deployment configuration variable and set it as desired.

    Once your TeamCity build configuration is complete, clicking the "Run" button will compile, publish, and deploy all of your code to Windows Azure! One additional enormous benefit of automating the process this way is that you can easily deploy any specific source control changeset by clicking the little ellipsis button next to "Run." This will bring up a dialog where you can select the last change to use for your deployment. Since Azure Web Role deployments don't have any rollback functionality, this is a critical feature.

    Enjoy!

  • How to reference both AssemblyVersion and AssemblyFileVersion?

    - by Chuck
    I need to display both the AssemblyVersion and the AssemblyFileVersion. In AssemblyInfo.cs, I have:

        [assembly: AssemblyVersion("1.0.*")]
        [assembly: AssemblyFileVersion("2009.8.0")]

    However, I only get "2009.8.0" when I reference the above with:

        public class VersionInfo
        {
            public static string AppVersion()
            {
                return System.Diagnostics.FileVersionInfo.GetVersionInfo(System.Reflection.Assembly.GetExecutingAssembly().Location).FileMajorPart + "." +
                       System.Diagnostics.FileVersionInfo.GetVersionInfo(System.Reflection.Assembly.GetExecutingAssembly().Location).FileMinorPart + "." +
                       System.Diagnostics.FileVersionInfo.GetVersionInfo(System.Reflection.Assembly.GetExecutingAssembly().Location).FileBuildPart;
            }
        }

    How can I display both values? Thanks.
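    For reference, a short sketch of reading both values (class and member names here are illustrative): FileVersionInfo surfaces the AssemblyFileVersion, while the AssemblyVersion travels with the assembly's identity.

        using System.Reflection;

        public static class VersionDisplay
        {
            // AssemblyVersion ("1.0.*") comes from the assembly identity
            public static string AssemblyVersion()
            {
                return Assembly.GetExecutingAssembly().GetName().Version.ToString();
            }

            // AssemblyFileVersion ("2009.8.0") comes from the Win32 version resource
            public static string FileVersion()
            {
                string location = Assembly.GetExecutingAssembly().Location;
                return System.Diagnostics.FileVersionInfo.GetVersionInfo(location).FileVersion;
            }
        }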

  • Makefiles - Compile all .cpp files in src/ to .o's in obj/, then link to binary in /

    - by Austin Hyde
    So, my project directory looks like this:

        /project
            Makefile
            main
            /src
                main.cpp
                foo.cpp
                foo.h
                bar.cpp
                bar.h
            /obj
                main.o
                foo.o
                bar.o

    What I would like my Makefile to do is compile all .cpp files in the /src folder to .o files in the /obj folder, then link all the .o files in /obj into the output binary in the root folder /project. The problem is, I have next to no experience with Makefiles, and am not really sure what to search for to accomplish this. Also, is this a "good" way to do this, or is there a more standard approach?
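    A minimal sketch of such a Makefile, assuming GNU Make (the flag variables are illustrative):

        # Collect sources and map src/foo.cpp -> obj/foo.o
        SRC := $(wildcard src/*.cpp)
        OBJ := $(patsubst src/%.cpp,obj/%.o,$(SRC))

        main: $(OBJ)
        	$(CXX) -o $@ $^

        # Compile each source file into obj/, creating obj/ first if needed
        obj/%.o: src/%.cpp | obj
        	$(CXX) $(CXXFLAGS) -c -o $@ $<

        obj:
        	mkdir -p obj

        clean:
        	rm -f main obj/*.o

        .PHONY: clean

    As for a more standard approach: this wildcard layout is common for small projects; beyond that, people usually reach for a generator such as CMake or automake rather than growing the Makefile by hand.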

  • How to update maven repository in eclipse?

    - by Stephane Grenier
    Assuming you're already using the m2eclipse plugin, what can you do if it doesn't update the dependencies to the latest in your repo? For example, on the command line you can just add the -U flag, as in:

        mvn clean install -U

    to force the dependencies to be updated. Is there something like this within Eclipse, since it doesn't always seem to pick up the latest updates?

  • What is "UseRANU" parameter in Visual Studio

    - by sudarsanyes
    I have created a package in VS2010 RC using the MPF (Managed Package Framework) and I get the following error. Can somebody help me out with this?

        The "UseRANU" parameter is not supported by the "VsTemplatePaths" task. Verify the parameter exists on the task, and it is a settable public instance property.
        The "VsTemplatePaths" task could not be initialized with its input parameters.

  • Shark was unable to find symbol information for this address range - iPhone

    - by Elliot
    I'm trying to use Shark to determine which method(s) are taking the most time in my iPhone app. After sampling, clicking the "!" button yields:

        Shark was unable to find symbol information for this address range. Typically this happens because the application was compiled without symbols or they have been subsequently stripped away. In Xcode, make sure the "Generate Debug Symbols" checkbox is selected (passes the -g flag to the compiler). Note that this does not affect code optimization, and does not typically alter performance significantly. However, the extra symbol information does consume significantly more space and may bloat the size of the executable.

    But I AM using the Debug option, and I am running on my device. And "Generate Debug Symbols" IS checked. So what's wrong?

  • Continuous integration with .NET and SVN

    - by stiank81
    We're currently not applying the automated building and testing of continuous integration in our project. We haven't bothered so far as we're only 2 developers working on it, but even with a team of 2 I still think it would be valuable to use continuous integration and get confirmation that our builds don't break and that tests don't start failing. We're using .NET with C# and WPF. We have created Python scripts for building the application (using MSBuild) and for running all tests. Our source is in SVN. What would be the best approach to applying continuous integration with this setup? What tool should we get? It should be one which doesn't require a lot of setup. Simple procedures to get started and little maintenance are a must.
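    For a concrete starting point, here is a minimal sketch of one popular free option, CruiseControl.NET, wired to SVN and MSBuild (the server URL, paths, and project name are placeholders, and this is not a recommendation over alternatives like TeamCity or Hudson):

        <cruisecontrol>
          <project name="MyApp">
            <sourcecontrol type="svn">
              <trunkUrl>http://svnserver/myapp/trunk</trunkUrl>
              <workingDirectory>C:\ccnet\myapp</workingDirectory>
              <autoGetSource>true</autoGetSource>
            </sourcecontrol>
            <triggers>
              <!-- Poll SVN every 60 seconds -->
              <intervalTrigger seconds="60" />
            </triggers>
            <tasks>
              <msbuild>
                <executable>C:\Windows\Microsoft.NET\Framework\v3.5\MSBuild.exe</executable>
                <projectFile>MyApp.sln</projectFile>
                <targets>Build</targets>
              </msbuild>
            </tasks>
          </project>
        </cruisecontrol>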

  • Building an SQL query from another query in PHP

    - by Nina
    Hello. When I try to build a query from another query in PHP, I run into a problem. Can you tell me why? :( Code:

        $First = "SELECT ro.RoomID, ro.RoomName, ro.RoomLogo, jr.RoomID, jr.MemberID, ro.RoomDescription
                  FROM joinroom jr, rooms ro
                  WHERE (ro.RoomID = jr.RoomID) AND jr.MemberID = '1'";
        $sql1 = mysql_query($First);
        $constract .= "ro.RoomName LIKE '%$search_each%'";
        $constract = "SELECT * FROM $sql1 WHERE $constract"; // This statement makes the error
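    The likely cause, stated as a diagnosis of the snippet rather than a certainty: mysql_query() returns a result resource, not a table name, so interpolating $sql1 into a second query's FROM clause produces invalid SQL. A minimal sketch of one fix is to fold both conditions into a single statement ($search_each should be escaped before use):

        // One combined query instead of querying "from" a result resource
        $search = mysql_real_escape_string($search_each);
        $query  = "SELECT ro.RoomID, ro.RoomName, ro.RoomLogo, jr.MemberID, ro.RoomDescription
                   FROM joinroom jr
                   JOIN rooms ro ON ro.RoomID = jr.RoomID
                   WHERE jr.MemberID = '1'
                     AND ro.RoomName LIKE '%$search%'";
        $result = mysql_query($query);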

  • LibPNG + Boost::GIL: png_infopp_NULL not found

    - by Viet
    Hi, I always get this error when trying to compile my file with Boost::GIL PNG IO support. (I'm running Mac OS X Leopard and Boost 1.42, libpng 1.4.)

        /usr/local/include/boost/gil/extension/io/png_io_private.hpp: In member function 'void boost::gil::detail::png_reader::init()':
        /usr/local/include/boost/gil/extension/io/png_io_private.hpp:155: error: 'png_infopp_NULL' was not declared in this scope
        /usr/local/include/boost/gil/extension/io/png_io_private.hpp:160: error: 'png_infopp_NULL' was not declared in this scope
        /usr/local/include/boost/gil/extension/io/png_io_private.hpp: In destructor 'boost::gil::detail::png_reader::~png_reader()':
        /usr/local/include/boost/gil/extension/io/png_io_private.hpp:174: error: 'png_infopp_NULL' was not declared in this scope
        /usr/local/include/boost/gil/extension/io/png_io_private.hpp: In member function 'void boost::gil::detail::png_reader::apply(const View&)':
        /usr/local/include/boost/gil/extension/io/png_io_private.hpp:186: error: 'int_p_NULL' was not declared in this scope
        /usr/local/include/boost/gil/extension/io/png_io_private.hpp: In member function 'void boost::gil::detail::png_reader_color_convert<CC>::apply(const View&)':
        /usr/local/include/boost/gil/extension/io/png_io_private.hpp:228: error: 'int_p_NULL' was not declared in this scope
        /usr/local/include/boost/gil/extension/io/png_io_private.hpp: In member function 'void boost::gil::detail::png_writer::init()':
        /usr/local/include/boost/gil/extension/io/png_io_private.hpp:317: error: 'png_infopp_NULL' was not declared in this scope
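    For context, a commonly suggested workaround: libpng 1.4 removed the png_infopp_NULL and int_p_NULL convenience macros that Boost 1.42's GIL IO headers still reference. Restoring the macros before including the header (or building against libpng 1.2) is the usual fix; treat this sketch as an assumption, not something verified on this exact setup:

        // Restore the macros libpng 1.4 dropped, before GIL's header uses them
        #define png_infopp_NULL (png_infopp)NULL
        #define int_p_NULL (int*)NULL
        #include <boost/gil/extension/io/png_io.hpp>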

  • Abusing the word "library"

    - by William Pursell
    I see a lot of questions, both here on SO and elsewhere, about "maintaining common libraries in a VCS". That is, projects foo and bar both depend on libbaz, and the questioner is wondering how they should import the source for libbaz into the VCS for each project. My question is: WTF? If libbaz is a library, then foo doesn't need its source code at all. There are some libraries that are reasonably designed to be used in this manner (e.g. gnulib), but for the most part foo and bar ought to just link against the library. I guess my thinking is: if you cut and paste source for a library into your own source tree, then you obviously don't care about future updates to the library. If you care about updates, then just link against the library and trust the library maintainers to maintain a stable API. If you don't trust the API to remain stable, then you can't blindly update your own copy of the source anyway, so what is gained? To summarize the question: why would anyone want to maintain a copy of a library in the source code of a project rather than just linking against that library and requiring it as a dependency? If the only answer is "we don't want the dependency", then why not just distribute a copy of the library along with your app, but keep them totally separate?

  • Visual Studio and Ajax Control Toolkit

    - by Steve
    In my VS 2008 web application solution, I have AjaxControlToolkit.dll in my bin directory and a whole set of language directories for it (ar, cs, de, es, fr, he, etc.). I don't remember how the language directories got in there. If I am using other languages via the ACT I presumably need these directories, but if I'm not, can I remove them? When I do a rebuild solution, the dll (AjaxControlToolkit.resources.dll) in these directories disappears. If I do need them, what do I need to do to keep them from being deleted during a rebuild?

  • Gradle directory structure

    - by liam.j.bennett
    I am working on a Java Ant+Ivy based project that has the following directory structure:

        projectRoot/src
        projectRoot/classes
        projectRoot/conf
        projectRoot/webservices

    This works perfectly well in Ant, but I am looking to migrate to Gradle. Is there a way to define a non-Maven directory structure in Gradle, or should I be looking to mavenize?
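    For what it's worth, a minimal sketch of mapping that layout onto Gradle's java plugin in build.gradle (treating conf as resources is an assumption, and webservices would become its own source set or module depending on what it holds):

        apply plugin: 'java'

        sourceSets {
            main {
                // Keep the existing Ant layout instead of src/main/java
                java {
                    srcDirs = ['src']
                }
                resources {
                    srcDirs = ['conf']
                }
            }
        }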

  • JavaScript build management - "must have" tools?

    - by lajuette
    Are there any must-have tools for JavaScript (RIA) development, like Maven, JUnit, Emma, link4j etc. in the Java world? What is the best way to set up a continuous integration system for a bigger application or framework? How do projects like jQuery test their code? How do you manage dependencies and different project configurations? Tools I know so far:

      - javascript-maven-tools (is Maven the right choice?)
      - jslint
      - yuicompressor
      - sprockets (found it 5 mins ago)
      - jsunit
      - selenium

  • Maven multi-module project with many reports: looking for an example

    - by hstoerr
    Is there an open source project that can serve as a good example of how to use the Maven site plugin to generate reports? I would prefer it to:

      - consist of many modules, possibly hierarchically structured
      - use as many plugins as possible (surefire, jxr, pmd, findbugs, javadoc, checkstyle, you name it)
      - aggregate the reports: if some tests fail, you want a single page that shows all modules with failing tests, not a gazillion individual pages to check
      - include enterprisey stuff (WAR, EAR etc.), but this is not so important

    The idea is to have something where you can gather ideas on how it is done and what is possible.
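    As a point of reference, aggregation is typically switched on per reporting plugin in the parent POM; a minimal sketch (the plugin list is illustrative, not exhaustive):

        <reporting>
          <plugins>
            <plugin>
              <groupId>org.apache.maven.plugins</groupId>
              <artifactId>maven-surefire-report-plugin</artifactId>
              <configuration>
                <!-- Roll all modules' test results into one page -->
                <aggregate>true</aggregate>
              </configuration>
            </plugin>
            <plugin>
              <groupId>org.apache.maven.plugins</groupId>
              <artifactId>maven-javadoc-plugin</artifactId>
              <configuration>
                <aggregate>true</aggregate>
              </configuration>
            </plugin>
          </plugins>
        </reporting>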

  • How can I compile GCC as a static binary?

    - by CaCl
    How can I compile GCC so that I can pull the entire thing over to another system and use it? I don't mind pulling in other files too, but is there a way to gather all the required system libs as well? The OS and architecture will remain constant across the different systems, but one may run Slackware where the other runs Debian.

  • Makefile: finding include/lib for libraries installed through macports

    - by Henk
    Libraries/include files installed by MacPorts go in /opt/local/lib and /opt/local/include, neither of which is scanned by gcc/ld by default. As a result, a project I'm working on won't compile in that environment. Should this be fixed by manually adding -L/opt/local/lib to my Makefile's LDFLAGS (and -I... as well), or is there some configuration that should be done to fix this globally on the computer?
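    A minimal sketch of the Makefile-local route (the global alternative would be environment variables such as CPATH and LIBRARY_PATH, which gcc honors; which approach is appropriate is a judgment call about the machine):

        # Point the preprocessor and the linker at the MacPorts prefix
        CPPFLAGS += -I/opt/local/include
        LDFLAGS  += -L/opt/local/lib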
