Search Results

Search found 27426 results on 1098 pages for 'build tools'.


  • Tools to test multicast routing

    - by Zoredache
    I am looking for a good, simple tool that runs on a standard OS (Windows or Linux) that I can use to test that multicast is being passed properly by a router. I have been asked by a client to enable multicast routing on a Linux box acting as their router, since their phone system requires multicast for a few features. Since I am not physically near the client, I don't really have the ability to experiment with the various methods for setting up multicast routing on Linux. I can set up a router at my desk that is identical to what is deployed on their network, but I don't know of any good, simple tools that I can use to generate or listen for multicast traffic. The one multicast tool I have found is mcast.exe, which is part of the Windows 2000/2003 resource kit. From what I have read online, mcast.exe does not work across a router and only works on the local network, so it doesn't seem useful for testing multicast routing. So what tool(s) do you use to test that multicast routing is properly set up?
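
    One low-dependency option is to script the probe yourself: a minimal sender/listener pair in stock Python (a sketch; the group 239.1.1.1 and port 5007 are arbitrary placeholders) is enough to generate and receive multicast, and setting IP_MULTICAST_TTL above 1 is what lets the traffic cross one router hop:

        # listener.py -- run on one side of the router; join the group and print what arrives
        import socket, struct

        GROUP, PORT = "239.1.1.1", 5007
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.bind(("", PORT))
        mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
        while True:
            print(sock.recvfrom(1024))

        # sender.py -- run on the far side of the router
        import socket

        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)  # TTL > 1 so the packet survives the hop
        sock.sendto(b"multicast test", ("239.1.1.1", 5007))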

    Read the article

  • Free Multi monitor tools

    - by vaccano
    If this has been asked before, I apologize (I did not see it in the "related questions"). What are some good free (and spyware- and adware-free) multi-monitor tools? My trial of UltraMon just ran out and (sadly) my boss will not spring for a license. I want features like multiple taskbars and maximized-window dragging (and/or other cool features I don't know about). I know there is at least one out there that is free and allows moving windows easily between monitors (I had it before, but I can't remember its name). I want to install it again, or something similar or better.

    Read the article

  • terminal tools and logs for debugging TCP issues

    - by kellogs
    I have a server which I am testing for functionality (not load, not stress) with tsung (a testing framework): 50 users/second, 100 total users. Judging from the tsung graphs, the TCP connections (red line) drop to 0 while the commenced user sessions (green line) do not. The server logs show nothing to grip onto, so I suspect some kind of TCP issue. Could that be the case? Where would I look further on the server? What logs or tools should I be looking at? Only SSH is available, no GUI. > root@XMPP:~# cat /etc/lsb-release > DISTRIB_ID=Ubuntu > DISTRIB_RELEASE=11.10 > DISTRIB_CODENAME=oneiric > DISTRIB_DESCRIPTION="Ubuntu 11.10" Thank you
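
    A few server-side checks that need nothing beyond a standard Ubuntu install (the interface name eth0 is an assumption):

        ss -s                                                      # socket summary; watch orphaned/timewait counts during the test
        dmesg | grep -i "syn flood"                                # the kernel logs "possible SYN flooding" when the listen backlog overflows
        sudo tcpdump -i eth0 -n 'tcp[tcpflags] & (tcp-rst) != 0'   # capture any RSTs the server sends mid-test

    A listen backlog or file-descriptor limit that caps out around the total user count is a common culprit when connection counts drop while sessions appear to continue, so ulimit -n and the application's backlog setting are also worth checking.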

    Read the article

  • Why do I get the error "Only antlib URIs can be located from the URI alone,not the URI" when trying to run hibernate tools in my build.xml

    - by Casbah
    I'm trying to run hibernate tools in an ant build to generate ddl from my JPA annotations. Ant dies on the taskdef tag. I've tried with ant 1.7, 1.6.5, and 1.6 to no avail. I've tried both in eclipse and outside. I've tried including all the hbn jars in the hibernate-tools path and not. Note that I based my build file on this post: http://stackoverflow.com/questions/281890/hibernate-jpa-to-ddl-command-line-tools I'm running eclipse 3.4 with WTP 3.0.1 and MyEclipse 7.1 on Ubuntu 8. Build.xml: <project name="generateddl" default="generate-ddl"> <path id="hibernate-tools"> <pathelement location="../libraries/hibernate-tools/hibernate-tools.jar" /> <pathelement location="../libraries/hibernate-tools/bsh-2.0b1.jar" /> <pathelement location="../libraries/hibernate-tools/freemarker.jar" /> <pathelement location="../libraries/jtds/jtds-1.2.2.jar" /> <pathelement location="../libraries/hibernate-tools/jtidy-r8-20060801.jar" /> </path> <taskdef classname="org.hibernate.tool.ant.HibernateToolTask" classpathref="hibernate-tools"/> <target name="generate-ddl" description="Export schema to DDL file"> <!-- compile model classes before running hibernatetool --> <!-- task definition; project.class.path contains all necessary libs <taskdef name="hibernatetool" classname="org.hibernate.tool.ant.HibernateToolTask" classpathref="project.class.path" /> --> <hibernatetool destdir="sql"> <!-- check that directory exists --> <jpaconfiguration persistenceunit="default" /> <classpath> <dirset dir="WebRoot/WEB-INF/classes"> <include name="**/*"/> </dirset> </classpath> <hbm2ddl outputfilename="schemaexport.sql" format="true" export="false" drop="true" /> </hibernatetool> </target> Error message (ant -v): Apache Ant version 1.7.0 compiled on December 13 2006 Buildfile: /home/joe/workspace/bento/ant-generate-ddl.xml parsing buildfile /home/joe/workspace/bento/ant-generate-ddl.xml with URI = file:/home/joe/workspace/bento/ant-generate-ddl.xml Project base dir set to: /home/joe/workspace/bento [antlib:org.apache.tools.ant] Could not load definitions from resource org/apache/tools/ant/antlib.xml. It could not be found. BUILD FAILED /home/joe/workspace/bento/ant-generate-ddl.xml:12: Only antlib URIs can be located from the URI alone,not the URI at org.apache.tools.ant.taskdefs.Definer.execute(Definer.java:216) at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:616) at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:105) at org.apache.tools.ant.Task.perform(Task.java:348) at org.apache.tools.ant.Target.execute(Target.java:357) at org.apache.tools.ant.helper.ProjectHelper2.parse(ProjectHelper2.java:140) at org.eclipse.ant.internal.ui.antsupport.InternalAntRunner.parseBuildFile(InternalAntRunner.java:191) at org.eclipse.ant.internal.ui.antsupport.InternalAntRunner.run(InternalAntRunner.java:400) at org.eclipse.ant.internal.ui.antsupport.InternalAntRunner.main(InternalAntRunner.java:137) Total time: 195 milliseconds
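
    For comparison: <taskdef> wants either a name plus classname, or an antlib resource/URI. The active taskdef above has only classname and classpathref, which pushes Ant into its antlib-URI lookup and yields exactly this message. The commented-out declaration in the same build file already shows the shape of the likely fix (a sketch, not verified against this project):

        <taskdef name="hibernatetool"
                 classname="org.hibernate.tool.ant.HibernateToolTask"
                 classpathref="hibernate-tools"/>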

    Read the article

  • Best multi-platform mobile development tool, or use iPhone tools?

    - by Jesse Millikan
    I may be building a mobile app for a client soon. Their primary focus is the iPhone, but my boss would like to be able to target multiple platforms if it's feasible. The app will probably be a large but technically simple business application backed by a web service. So, here's the question as I see it: What is currently the strongest cross-platform mobile development tool that supports iOS? Would you choose it over native development tools? If you choose native, contrast it with a cross-platform tool you've used. In addition, For a project of the type we're expecting, what's the level of effort for your chosen tool versus other tools? What's the actual level of support of the tool for other platforms and their unique look and feel, capabilities, etc.? How thorough is the documentation of the product? How well do you like the development experience itself, e.g. the language, tools, documentation? Is it something you would choose to do long-term? I'll put a bounty out unless I get fantastic answers.

    Read the article

  • Is there a way to split/factor out common parts of a Gradle build

    - by Superfilin
    We have several independent builds (each independent build is a multi-project build). The main build scripts have become quite big: we have a set of common tasks reused by subprojects, and there is a lot of repetition between independent builds. What we are looking for is: A way to split the main build file into smaller files A way to reuse some parts of the build in other independent builds What is the best way to achieve that in Gradle?
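
    One common pattern (a sketch in the Groovy DSL; file names are placeholders) is to move shared task definitions into script plugins and pull them in with apply from:, which covers both points, since a script plugin can be a small local file next to the main script or a shared file referenced by several independent builds:

        // build.gradle -- the main script stays short
        apply from: 'gradle/common-tasks.gradle'           // split into smaller local files
        apply from: '../build-common/conventions.gradle'   // reused by other independent builds

        // gradle/common-tasks.gradle -- defined once, applied wherever needed
        task sharedReport {
            doLast { println 'common task logic lives here' }
        }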

    Read the article

  • Build file generated by android broken?

    - by Eno
    I generated a build file for an Android project using the android tool (I'm using the build file in Hudson to automate builds). However, in Eclipse the build.xml file is flagged as having an error, so my project no longer builds in Eclipse: it says that the default target ("help") doesn't exist (it's in one of the Android build files referenced in build.xml, and command-line builds work just fine). So what do I need to do to keep Eclipse quiet?

    Read the article

  • TFS 2008 Build and deploy to Inetpub web folder

    - by mattgcon
    I have TFS 2008 and have a build running; however, I want to automate deployment of the build output and place it into the Inetpub folder it belongs to. That is: run the build, then automatically place the newly built solution into Inetpub/wwwroot/websitefolder. I have tried xcopy, robocopy and SyncToy 2.1, and I cannot get any of them to work. Can anyone please help me with this dilemma? Thank you in advance
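
    For reference, a hedged robocopy line of the kind that could run as a post-build step (both paths are placeholders; the build service account needs write access to the target):

        robocopy "C:\Builds\MyBuild\Latest\_PublishedWebsites\MySite" ^
                 "C:\Inetpub\wwwroot\websitefolder" /MIR /R:2 /W:5

    One robocopy quirk that can look like "not working" in a build: it returns nonzero exit codes on success (1 means files were copied), so a wrapper that treats any nonzero code as failure will report an error even when the copy succeeded.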

    Read the article

  • Chromium OS Build: How can I use a prebuilt chromium in my OS build?

    - by trees
    I'm attempting to build Chromium OS from source and am having some issues integrating the browser into the build. I have a zip of a prebuilt version, but I have not been able to find documentation on what to do with it. I've been following the guide that Google provides and have just set up the board, but now I am stuck; the directions for building/including Chromium in the OS are far too vague. Any help would be greatly appreciated.

    Read the article

  • Execute build task in Hudson with root privileges

    - by jensendarren
    I have a build script which executes apt-get and therefore requires root privileges. What is the best way to run this script in Hudson? Currently the only solution I have found that works is to add an entry to the sudoers file for the user hudson like so: hudson ALL=(ALL) NOPASSWD:ALL However, although my build script now runs without error in Hudson, I am not entirely comfortable with this solution. Is there a better way?
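
    One tighter variant is to grant NOPASSWD for just the one command the build needs, rather than ALL (the path below is the stock Ubuntu location for apt-get):

        hudson ALL=(root) NOPASSWD: /usr/bin/apt-get

    The build script then calls sudo /usr/bin/apt-get ... explicitly; this limits a compromised hudson account to package operations instead of arbitrary commands as root.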

    Read the article

  • VMware Tools on Ubuntu Server 10.10 kernel source problem

    - by Hamid Elaosta
    After installing VMware Tools and running the VMware config script, the configurator needs my kernel headers to compile some modules. OK, so I'll give it them, but it just won't work. It asks for the path of the directory of C header files that match my running kernel. If I run uname -r I get 2.6.35-22-generic-pae. So I tell it the source path is /lib/modules/2.6.25-22-generic-pae/build/include, and it returns "The directory of kernel headers (version @@VMWARE@@ UTS_RELEASE) does not match your running kernel (version 2.6.35-22-generic-pae)." I'm confused; can anyone offer suggestions, please? I installed the kernel source and headers myself using sudo apt-get install linux-headers-$(uname -r)
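
    Two details stand out here (assuming the stock Ubuntu packages): the path given to the configurator says 2.6.25 where the running kernel is 2.6.35, and the headers package installs its include directory under /usr/src rather than /lib/modules:

        sudo apt-get install linux-headers-$(uname -r)    # make sure the matching headers really are installed
        ls -d /usr/src/linux-headers-$(uname -r)/include  # the directory the VMware configurator is asking for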

    Read the article

  • Building Visual Studio Setup Projects with TFS 2010 Team Build

    - by Jakob Ehn
    One of the most common complaints from people starting to use Team Build is that it doesn't support building Microsoft's own Setup and Deployment projects (*.vdproj). When creating a default build definition that compiles a solution containing a setup project, you'll get the following warning: The project file "MyProject.vdproj" is not supported by MSBuild and cannot be built.   This is what the problem is all about. MSBuild, which is used for compiling your projects, does not understand the proprietary vdproj format defined by Microsoft quite some time ago. Unfortunately there is no sign that this will change in the near future; in fact the setup projects have barely changed at all since they were introduced. VS 2010 brings no new features or improvements when it comes to the setup projects. VS 2010 does include a limited version of InstallShield, which promises to be more MSBuild-friendly and to offer more or less the same features as VS setup projects. I hope to get a closer look at this installer project type soon. But how do we go about building a Visual Studio setup project and producing an MSI as part of a Team Build process? Well, since only one application known to man understands vdproj projects, we will have to install a copy of Visual Studio on the build server. Sad but true. After doing this, we use the Visual Studio command-line interface (devenv) to perform the build. In this post I will show how to do this by using the InvokeProcess activity directly in a build workflow template. You'll want to build your setup projects after you have successfully compiled the projects.   Install Visual Studio 2010 on the build server(s)   Open your build process template (remember to branch or copy the xaml file before modifying it!)   Locate the Try to Compile the Project activity   Drop an instance of the InvokeProcess activity from the toolbox onto the designer, after the Run MSBuild for Project activity   Drop an instance of the WriteBuildMessage activity inside the Handle Standard Output section. Set the Importance property to Microsoft.TeamFoundation.Build.Client.BuildMessageImportance.High (NB: this is necessary if you want the output from devenv to show up in the build log when running the build with the default verbosity). Set the Message property to stdOutput   Drop an instance of the WriteBuildError activity in the Handle Error Output section. Set the Message property to errOutput   Select the InvokeProcess activity and set the values of its parameters:     The finished workflow should look like this:     This will generate the MSI files, but they won't be copied to the drop location. This is because we are using devenv and not MSBuild, so we have to do this explicitly.   Drop a Sequence activity somewhere after the Copy to Drop location activity.   Create a variable in the Sequence activity of type IEnumerable<String> and call it GeneratedInstallers   Drop a FindMatchingFiles activity in the Sequence activity and set its properties:     Drop a ForEach<String> activity after the FindMatchingFiles activity. Set the Value property to GeneratedInstallers   Drop an InvokeProcess activity inside the ForEach activity.  FileName: "xcopy.exe" Arguments: String.Format("""{0}"" ""{1}""", item, BuildDetail.DropLocation) The Sequence activity should look like this:     Save the build process template and check it in.   Run the build and verify that the MSIs are built and copied to the drop location.
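
    As a hedged example of what the first InvokeProcess parameters typically end up looking like (the install path, configuration and project names are placeholders, and localSolutionPath stands in for however the workflow exposes the local path of the solution; devenv.com rather than devenv.exe so the output goes to the console and is captured by the activities above):

        FileName:  "C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\devenv.com"
        Arguments: String.Format("""{0}"" /Build ""Release"" /Project ""Setup\MySetup.vdproj""", localSolutionPath)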
Note 1: One of the drawbacks of using devenv like this in a team build is that since all the output from the default compilations is placed in the Binaries folder, the outputs are not available when devenv is invoked, which causes the whole solution to rebuild again. In TFS 2008, this was pretty simple to fix by using the CustomizableOutDir property. In TFS 2010, the same feature is not available. Jim Lamb blogged about this recently; have a look at it if you have a problem with this: http://blogs.msdn.com/jimlamb/archive/2010/04/13/customizableoutdir-in-tfs-2010.aspx   Note 2: Although the above solution works, a better approach is to wrap this in a custom activity that you can use in your builds. I will come back to this in a future post.

    Read the article

  • Dependency Replication with TFS 2010 Build

    - by Jakob Ehn
    Some time ago, I wrote a post about how to implement dependency replication using TFS 2008 Build. We use this for library builds, where we set up a build definition for a common library and have the build check the resulting assemblies back into source control. The folder is then branched to the applications that need to reference the common library. See the above post for more details. Of course, we have reimplemented this feature in TFS 2010 Build, which results in a much nicer experience for the developer who wants to set up a new library build. Here is how it looks: There is a separate build process template for library builds registered in all team projects. The following properties are used to configure the library build: Deploy Folder in Source Control is the server path where the assemblies should be checked in. DeploymentFiles is a list of files and/or extensions specifying which files to check in. The default here is *.dll;*.pdb, which means that all assemblies and debug symbols will be checked in. We can also type, for example, CommonLibrary.*;SomeOtherAssembly.dll in order to exclude other assemblies. You can also see that we are versioning the assemblies as part of the build. This is important, since the resulting assemblies will be deployed together with the referencing application.   When the build executes, it will check whether the matching assemblies exist in source control; if not, it will add the files automatically:   After the build has finished, we can see in the history of the TestDeploy folder that the build service account has in fact checked in a new version: Nice!   The implementation of the library build process template is not very complicated; it is a combination of customization of the build process template and some custom activities. We use the generic TFActivity (http://geekswithblogs.net/jakob/archive/2010/11/03/performing-checkins-in-tfs-2010-build.aspx) to check files in and out, but for the part that checks if a file exists and adds it to source control, it was easier to do this in a custom activity:   public sealed class AddFilesToSourceControl : BaseCodeActivity { // Files to add to source control [RequiredArgument] public InArgument<IEnumerable<string>> Files { get; set; } [RequiredArgument] public InArgument<Workspace> Workspace { get; set; } // If your activity returns a value, derive from CodeActivity<TResult> // and return the value from the Execute method. protected override void Execute(CodeActivityContext context) { foreach (var file in Files.Get(context)) { if (!File.Exists(file)) { throw new ApplicationException("Could not locate " + file); } var ws = this.Workspace.Get(context); string serverPath = ws.TryGetServerItemForLocalItem(file); if (!String.IsNullOrEmpty(serverPath)) { if (!ws.VersionControlServer.ServerItemExists(serverPath, ItemType.File)) { TrackMessage(context, "Adding file " + file); ws.PendAdd(file); } else { TrackMessage(context, "File " + file + " already exists in source control"); } } else { TrackMessage(context, "No server path for " + file); } } } } This build template is a very nice tool that makes it easy to do dependency replication with TFS 2010. Next, I will add functionality for automatically merging the assemblies (using ILMerge) as part of the build; we do this to keep the number of references to a minimum.

    Read the article

  • REST tools support for development and testing

    - by nzpcmad
    There is a similar question here, but it only covers some of the issues below. We have a client who requires web services using REST. We have tons of experience using SOAP and over time have gathered a really good set of tools for SOAP development and testing, e.g. soapUI, Eclipse plugins, wsdl2java, WSStudio. By "tools" I mean a product "out of the box" that we can start using; I'm not talking about cutting code to "roll our own" using Ajax or whatever. The tool set for REST doesn't seem to be nearly as mature. What tools are out there (we use C# and Java mainly)? Do the tools handle GET, POST, PUT, and DELETE? Is there a decent Eclipse plugin? Is there a decent client testing application like WSStudio, where you point the tool at the WSDL and it generates a proxy on the fly with the appropriate methods and inputs, and you simply type the data in? Are there any good packet-monitoring tools that let you look at the data? (I'm not thinking of sniffers like Wireshark here, but rather things like soapUI that show you the request/response.)
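
    Not the out-of-the-box product asked about, but for quick manual smoke tests plain curl already covers all four verbs (the endpoint below is a placeholder):

        curl -X GET    http://example.com/api/orders/42
        curl -X POST   -H "Content-Type: application/xml" -d @order.xml http://example.com/api/orders
        curl -X PUT    -H "Content-Type: application/xml" -d @order.xml http://example.com/api/orders/42
        curl -X DELETE http://example.com/api/orders/42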

    Read the article

  • Can't remember the website for downloading common tools after reinstall

    - by JB
    I remember a while back finding a website for downloading lots of common tools in one go after a system reinstall. It had a dark background and checkboxes for selecting the tools (browsers, editors, readers, IM clients, etc.). After selecting the tools, it downloaded a single installer file which then fetched everything you'd selected and performed an install of each of the different apps. Does anyone have any idea what I'm talking about?

    Read the article

  • JavaScript build management - "must have" tools?

    - by lajuette
    Are there any must-have tools for JavaScript (RIA) development, along the lines of Maven, JUnit, Emma, lint4j, etc., but for JavaScript? What is the best way to set up a continuous integration system for a bigger application or framework? How do projects like jQuery test their code? How do you manage dependencies and different project configurations? Tools I know so far: javascript-maven-tools (is Maven the right choice?) jslint yuicompressor sprockets (found it 5 mins ago) jsunit selenium
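
    On the jQuery question specifically: jQuery's own suite runs on QUnit, where a test of that era looks roughly like this (a sketch using the old global API):

        // QUnit test -- ok() asserts a boolean with a message
        test("$.trim strips surrounding whitespace", function () {
            ok(jQuery.trim("  hi  ") === "hi", "leading/trailing whitespace removed");
        });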

    Read the article

  • why doesn't everyone use RAD tools?

    - by Sorskoot
    RAD tools like Clarion and WinDev claim to be 10 to 20 times faster for development, and users of these tools claim the same. If that is true, why are there so few people using these tools? If an application is made in 40 hours instead of 400, you would make a lot more money, right?

    Read the article

  • How do I set the Eclipse build path and class path from an Ant build file?

    - by Nels Beckman
    Hey folks, there's a lot of discussion about Ant and Eclipse, but no previous answer seems to help me. Here's the deal: I am trying to build a Java program that compiles successfully with Ant from the command line. (To confuse matters further, the program I am attempting to compile is Ant itself.) What I really want to do is to bring this project into Eclipse and have it compile in Eclipse such that the type bindings and variable bindings (nomenclature from Eclipse JDT) are correctly resolved. I need this because I need to run a static analysis on the code that is built on top of Eclipse JDT. The normal way I bring a Java project into Eclipse so that Eclipse will build it and resolve all the bindings is to just import the source directories into a Java project, and then tell it to use the src/main/ directory as a "source directory." Unfortunately, doing that with Ant causes the build to fail with numerous compile errors. It seems to me that the Ant build file is setting up the class path and build path correctly (possibly by excluding certain source files) and Eclipse does not have this information. Is there any way to take the class path and build path information embedded in an Ant build file, and give that information to Eclipse to put in its .project and .classpath files? I've tried creating a new project from an existing build file (an option in the File menu), but this does not help; the project still has the same compile errors. Thanks, Nels
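
    There is no stock bridge that reads path information out of an arbitrary build.xml, but Eclipse's .classpath is plain XML, so it can be generated from the same source and library lists the Ant build uses. A hand-written sketch of the file's shape (paths and the excluding pattern are placeholders; excluding= is one way to mirror source files that the Ant build conditionally leaves out):

        <?xml version="1.0" encoding="UTF-8"?>
        <classpath>
            <classpathentry kind="src" path="src/main" excluding="org/apache/tools/ant/taskdefs/optional/**"/>
            <classpathentry kind="lib" path="lib/optional/junit.jar"/>
            <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER"/>
            <classpathentry kind="output" path="build/classes"/>
        </classpath>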

    Read the article

  • Xcode "Build and Archive" from command line

    - by Dan Fabulich
    Xcode 3.2 provides an awesome new feature under the Build menu, "Build and Archive" which generates an .ipa file suitable for Ad Hoc distribution. You can also open the Organizer, go to "Archived Applications," and "Submit Application to iTunesConnect." Is there a way to use "Build and Archive" from the command line (as part of a build script)? I'd assume that xcodebuild would be involved somehow, but the man page doesn't seem to say anything about this. UPDATE Michael Grinich requested clarification; here's what exactly you can't do with command-line builds, features you can ONLY do with Xcode's Organizer after you "Build and Archive." You can click "Share Application..." to share your IPA with beta testers. As Guillaume points out below, due to some Xcode magic, this IPA file does not require a separately distributed .mobileprovision file that beta testers need to install; that's magical. No command-line script can do it. For example, Arrix's script (submitted May 1) does not meet that requirement. More importantly, after you've beta tested a build, you can click "Submit Application to iTunes Connect" to submit that EXACT same build to Apple, the very binary you tested, without rebuilding it. That's impossible from the command line, because signing the app is part of the build process; you can sign bits for Ad Hoc beta testing OR you can sign them for submission to the App Store, but not both. No IPA built on the command-line can be beta tested on phones and then submitted directly to Apple. I'd love for someone to come along and prove me wrong: both of these features work great in the Xcode GUI and cannot be replicated from the command line.

    Read the article

  • Team Foundation Server (TFS) Team Build Custom Activity C# Code for Assembly Stamping

    - by Bob Hardister
    For the full context and guidance on how to develop and implement a custom activity in Team Build see the Microsoft Visual Studio Rangers Team Foundation Build Customization Guide V.1 at http://vsarbuildguide.codeplex.com/ There are many ways to stamp or set the version number of your assemblies. This approach is based on the build number.   namespace CustomActivities { using System; using System.Activities; using System.IO; using System.Text.RegularExpressions; using Microsoft.TeamFoundation.Build.Client; [BuildActivity(HostEnvironmentOption.Agent)] public sealed class VersionAssemblies : CodeActivity { /// <summary> /// AssemblyInfoFileMask /// </summary> [RequiredArgument] public InArgument<string> AssemblyInfoFileMask { get; set; } /// <summary> /// SourcesDirectory /// </summary> [RequiredArgument] public InArgument<string> SourcesDirectory { get; set; } /// <summary> /// BuildNumber /// </summary> [RequiredArgument] public InArgument<string> BuildNumber { get; set; } /// <summary> /// BuildDirectory /// </summary> [RequiredArgument] public InArgument<string> BuildDirectory { get; set; } /// <summary> /// Publishes field values to the build report /// </summary> public OutArgument<string> DiagnosticTextOut { get; set; } // If your activity returns a value, derive from CodeActivity<TResult> and return the value from the Execute method. protected override void Execute(CodeActivityContext context) { // Obtain the runtime value of the input arguments string sourcesDirectory = context.GetValue(this.SourcesDirectory); string assemblyInfoFileMask = context.GetValue(this.AssemblyInfoFileMask); string buildNumber = context.GetValue(this.BuildNumber); string buildDirectory = context.GetValue(this.BuildDirectory); // ** Determine the version number values ** // Note: the format used here is: major.secondary.maintenance.build // ----------------------------------------------------------------- // Obtain the build definition name int nameStart = buildDirectory.LastIndexOf(@"\") + 1; string buildDefinitionName = buildDirectory.Substring(nameStart); // Set the primary.secondary.maintenance values // NOTE: these are hard coded in this example, but could be sourced from a file or parsed from a build definition name that includes them string p = "1"; string s = "5"; string m = "2"; // Initialize the build number string b; string na = "0"; // used for Assembly and Product Version instead of build number (see versioning best practices: **TBD reference) // Set qualifying product version information string productInfo = "RC2"; // Obtain the build increment number from the build number // NOTE: this code assumes the default build definition name format int buildIncrementNumberDelimterIndex = buildNumber.LastIndexOf("."); b = buildNumber.Substring(buildIncrementNumberDelimterIndex + 1); // Convert version to integer values int pVer = Convert.ToInt16(p); int sVer = Convert.ToInt16(s); int mVer = Convert.ToInt16(m); int bNum = Convert.ToInt16(b); int naNum = Convert.ToInt16(na); // ** Get all AssemblyInfo files and stamp them ** // Note: the mapping of AssemblyInfo.cs attributes to assembly display properties are as follows: // - AssemblyVersion = Assembly Version - used for the assembly version (does not change unless p, s or m values are changed) // - AssemblyFileVersion = File Version - used for the file version (changes with every build) // - AssemblyInformationalVersion = Product Version - used for the product version (can include additional version information) // 
------------------------------------------------------------------------------------------------------------------------------------------------ Version assemblyVersion = new Version(pVer, sVer, mVer, naNum); Version newAssemblyFileVersion = new Version(pVer, sVer, mVer, bNum); Version productVersion = new Version(pVer, sVer, mVer); // Setup diagnostic fields int numberOfReplacements = 0; string addedAssemblyInformationalAttribute = "No"; // Enumerate over the assemblyInfo version attributes foreach (string attribute in new[] { "AssemblyVersion", "AssemblyFileVersion", "AssemblyInformationalVersion" }) { // Define the regular expression to find in each and every Assemblyinfo.cs files (which is for example 'AssemblyVersion("1.0.0.0")' ) Regex regex = new Regex(attribute + @"\(""\d+\.\d+\.\d+\.\d+""\)"); foreach (string file in Directory.EnumerateFiles(sourcesDirectory, assemblyInfoFileMask, SearchOption.AllDirectories)) { string text = File.ReadAllText(file); // Read the text from the AssemblyInfo file // If the AsemblyInformationalVersion attribute is not in the file, add it as the last line of the file // Note: by default the AssemblyInfo.cs files will not contain the AssemblyInformationalVersion attribute if (!text.Contains("[assembly: AssemblyInformationalVersion(\"")) { string lastLine = Environment.NewLine + "[assembly: AssemblyInformationalVersion(\"1.0.0.0\")]"; text = text + lastLine; addedAssemblyInformationalAttribute = "Yes"; } // Search for the expression Match match = regex.Match(text); if (match.Success) { // Get file attributes FileAttributes fileAttributes = File.GetAttributes(file); // Set file to read only File.SetAttributes(file, fileAttributes & ~FileAttributes.ReadOnly); // Insert AssemblyInformationalVersion attribute into the file text if does not already exist string newText = string.Empty; if (attribute == "AssemblyVersion") { newText = regex.Replace(text, attribute + "(\"" + assemblyVersion + "\")"); numberOfReplacements++; } if (attribute == "AssemblyFileVersion") { newText = regex.Replace(text, attribute + "(\"" + newAssemblyFileVersion + "\")"); numberOfReplacements++; } if (attribute == "AssemblyInformationalVersion") { newText = regex.Replace(text, attribute + "(\"" + productVersion + " " + productInfo + "\")"); numberOfReplacements++; } // Publish diagnostics to build report (diagnostic verbosity only) context.SetValue(this.DiagnosticTextOut, " Added AssemblyInformational Attribute: " + addedAssemblyInformationalAttribute + " Number of replacements: " + numberOfReplacements + " Build number: " + buildNumber + " Build directory: " + buildDirectory + " Build definition name: " + buildDefinitionName + " Assembly version: " + assemblyVersion + " New file version: " + newAssemblyFileVersion + " Product version: " + productVersion + " AssemblyInfo.cs Text Last Stamped: " + newText); // Write the new text in the AssemblyInfo file File.WriteAllText(file, newText); // restore the file's original attributes File.SetAttributes(file, fileAttributes); } } } } } }

    Read the article

  • JPRT: A Build & Test System

    - by kto
    DRAFT A while back I did a little blogging on a system called JPRT, the hardware used and a summary on my java.net weblog. This is an update on the JPRT system. JPRT ("JDK Putback Reliability Testing", but ignore what the letters stand for, I change what they mean every day, just to annoy people :^) is a build and test system for the JDK, or any source base that has been configured for JPRT. As I mentioned in the above blog, JPRT is a major modification to a system called PRT that the HotSpot VM development team has been using for many years, very successfully I might add. Keeping the source base always buildable and reliable is the first step in the 12 steps of dealing with your product quality... or was that the 12 steps from Alcoholics Anonymous... oh well, anyway, it's the first of many steps. ;^) Internally, when we make changes to any part of the JDK, there are certain procedures we are required to perform prior to any putback or commit of the changes. The procedures often vary from team to team, depending on many factors, such as whether native code is changed, or whether the change could impact other areas of the JDK. But a common requirement is a verification that the source base with the changes (merged with the very latest source base) will build on many if not all 8 platforms, with a full 'from scratch' build, not an incremental build, which can hide full-build problems. The testing needed varies, depending on what has been changed. Anyone who has worked on a project where multiple engineers or groups are submitting changes to a shared source base knows how disruptive a 'bad commit' can be on everyone. How many times have you heard: "So-and-so made a bunch of changes and now I can't build!"? But multiply that by 8 platforms, and make all the platforms old and antiquated OS versions with bizarre system setup requirements, and you have a pretty complicated situation (see http://download.java.net/jdk6/docs/build/README-builds.html). We don't tolerate bad commits, but our enforcement is somewhat lacking; usually it's an 'after the fact' correction. Luckily the Source Code Management system we use (another antique called TeamWare) allows for a tree of repositories, and 'bad commits' are usually isolated to a small team. Punishment to date has been pretty drastic: the Queen of Hearts in 'Alice in Wonderland' said 'Off with their heads!', and trust me, you don't want to be the engineer doing a 'bad commit' to the JDK. With JPRT, hopefully this will become a thing of the past. Not that we have had many 'bad commits' to the master source base; in general the teams doing the integrations know how important their jobs are, and they rarely make 'bad commits'. So for these JDK integrators, maybe what JPRT does is keep them from chewing their fingernails at night. ;^) Over the years each of the teams has accumulated sets of machines they use for building, or they use some of the shared machines available to all of us. But the hunt for build machines is just part of the job, or has been. And although consistency of the build machines hasn't been a horrible problem, you often never know if the Solaris build machine you are using has all the right patches, or if the Linux machine has the right service pack, or if the Windows machine has its latest updates. Hopefully the JPRT system can solve this problem. When we ship the binary JDK bits, it is SO very important that the build machines are correct, and we know how difficult it is to get them set up.
Sure, if you need to debug a JDK problem that only shows up on Windows XP or Solaris 9, you'll still need to hunt down a machine, but not as a regular everyday occurrence. I'm a big fan of a regular nightly build and test system, constantly verifying that a source base builds and tests out. There are many examples of automated build/test systems: some trigger on any change to the source base, some just run every night. Some provide a protection gateway to the 'golden' source base, which only gets changes that the nightly process has verified are good. The JPRT (and PRT) system is meant to guard the source base before anything is sent to it, guarding all source bases from the evil developer; well, maybe 'evil' isn't the right word, I haven't met many 'evil' developers, more like 'error prone' developers. ;^) Hmm, come to think of it, I may be one from time to time. :^{ But the point is that by spreading the build over a set of machines, and getting the turnaround down to under an hour, it becomes realistic to completely build on all platforms and test it, on every putback. We have the technology, we can build and rebuild and rebuild, and it will be better than it was before, ha ha... Anybody remember the Six Million Dollar Man? Man, I gotta get out more often... Anyway, now the nightly build and test can become a 'fetch the latest JPRT build bits' step followed by extensive testing (the testing not done by JPRT, or on the platforms not tested by JPRT). Is it open source? No, not yet. Would you like it to be? Let me know. Or is it more important that you have the ability to use such a system for JDK changes? So, enough blabbering on about this JPRT system; tell me what you think. And let me know if you want to hear more about it or not. Stay tuned for the next episode, same Bloody Bat time, same Bloody Bat channel. ;^) -kto

    Read the article
