Search Results

Search found 24784 results on 992 pages for 'process integration packs'.


  • Execute C++ exe from C# form using Process.Start()

    - by Dan
    Hi, I'm trying to create a C# form app that will let me run all of my previous C++ programs from one central program. I'm able to open the exes with Process.Start; however, they don't run correctly. Example code: Process.Start("C:\\Documents and Settings\\dan\\Desktop\\test.exe"); This will bring up the console and act like it's running, but it does not run the way it does when I launch it from the C++ editor. Is there a StartInfo property I need to set to signify that it's a C++ program, or something along that line? Also, is there any way to execute a C++ program using Process.Start that will allow me to pass it variables through the command line via argc and argv? Thanks
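
    A hedged sketch for the argument-passing part of the question, assuming a test.exe that reads its inputs from argv (the argument values below are invented): whatever is placed in ProcessStartInfo.Arguments arrives in the child program's argc/argv. Setting the working directory is also worth trying, since an exe that loads files relative to its own folder can behave differently when launched from another directory.

        using System.Diagnostics;

        class LaunchSketch
        {
            static void Main()
            {
                var startInfo = new ProcessStartInfo
                {
                    FileName = "C:\\Documents and Settings\\dan\\Desktop\\test.exe",
                    // test.exe receives these as argv[1] and argv[2] (argc == 3)
                    Arguments = "firstValue secondValue",   // invented example values
                    UseShellExecute = false,                // launch directly, no shell
                    // assumption: the exe expects to run from its own folder
                    WorkingDirectory = "C:\\Documents and Settings\\dan\\Desktop"
                };
                Process.Start(startInfo);
            }
        }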

    Read the article

  • Failed to Kill Process in SQL 2008

    - by Andrea.Ko
    I have a process with the following information, and when I execute KILL on this SPID it returns "Only user processes can be killed."

        SPID: 11  Status: BACKGROUND  Login: sa  HostName: .  BlkBy: .  DBName: SAFEMIG  Command: CHECKPOINT

    Normally every session logged in to this server has a HostName showing the client PC name, but this connection shows only a dot, so I am not sure who is executing which process over this connection. I executed "dbcc inputbuffer(11)" and it returned "EventType = No Event, Parameters = 0, EventInfo = NULL". I'd appreciate any help or advice on this problem!

    Read the article

  • How to integrate JUnit/PMD/FindBugs reports into the Hudson build email?

    - by fei
    My team is looking into using Hudson as our continuous integration software. One problem we are trying to figure out is how to integrate the JUnit/PMD/FindBugs reports into the build email that gets sent to the team. The graphs and reports on the dashboard are nice and all, but people usually want to just read the email rather than click through links. I tried the email-ext plugin, but it doesn't provide much help for this. Is there any way I can get that information into the build email? Thanks!

    Read the article

  • Programmatic resource monitoring per process in Linux

    - by tuxx
    Hi, I want to know if there is an efficient solution to monitor a process's resource consumption (CPU, memory, network bandwidth) in Linux. I want to write a daemon in C++ that does this monitoring for some given PIDs. From what I know, the classic solution is to periodically read the information from /proc, but this doesn't seem like the most efficient way (it involves many system calls). For example, to monitor the memory usage every second for 50 processes, I have to open, read and close 50 files (that means 150 system calls) every second from /proc. Not to mention the parsing involved when reading these files. Another problem is the network bandwidth consumption: it cannot be easily computed for each process I want to monitor. The solution adopted by NetHogs involves a pretty high overhead in my opinion: it captures and analyzes every packet using libpcap, then for each packet the local port is determined and searched in /proc to find the corresponding process. Do you know of any more efficient alternatives to the methods presented, or any libraries that deal with these problems?

    Read the article

  • Java respawn process

    - by Bart van Heukelom
    I'm making an editor-like program. If the user chooses File-Open in the main window, I want to start a new copy of the editor process with the chosen filename as an argument. However, for that I need to know what command was used to start the first process:

        java -jar myapp.jar blabalsomearguments    // --- need this information

        Open File (fileUrl)
        exec("java -jar myapp.jar blabalsomearguments fileUrl");

    I'm not looking for an in-process solution; I've already implemented that. I'd like to have the benefits that separate processes bring.

    Read the article

  • JUnit test that creates other tests

    - by Benju
    Normally I would have one JUnit test that shows up in my integration server of choice as one test that passes or fails (in this case I use TeamCity). What I need for this specific test is the ability to loop through a directory structure testing that our data files can all be parsed without throwing an exception. Because we have 30,000+ files that take 1-5 seconds each to parse, this test will be run in its own suite. The problem is that I need a way to have one piece of code run as one JUnit test per file, so that if 12 files out of 30,000 fail I can see which 12 failed, not just that one test failed, threw a RuntimeException, and stopped. I realize that this is not a true "unit" test way of doing things, but this simulation is very important to make sure that our content providers are kept in check and do not check in invalid files. Any suggestions?

    Read the article

  • Problem with System.Diagnostics.Process RedirectStandardOutput to appear in Winforms Textbox in real-time

    - by Jonathan Websdale
    I'm having problems with the redirected output from a console application being presented in a WinForms textbox in real time. The messages are produced line by line; however, as soon as interaction with the Form is called for, nothing appears to be displayed. Despite following the many examples on both Stack Overflow and other forums, I can't seem to get the redirected output from the process to display in the textbox on the form until the process completes. By adding debug lines to the 'stringWriter_StringWritten' method and writing the redirected messages to the debug window, I can see the messages arriving while the process is running, but they will not appear in the form's textbox until the process completes. Grateful for any advice on this. Here's an extract of the code:

        using System;
        using System.Text;
        using System.Windows.Forms;

        public partial class RunExternalProcess : Form
        {
            private static int numberOutputLines = 0;
            private static MyStringWriter stringWriter;

            public RunExternalProcess()
            {
                InitializeComponent();

                // Create the output message writer
                RunExternalProcess.stringWriter = new MyStringWriter();
                stringWriter.StringWritten += new EventHandler(stringWriter_StringWritten);

                System.Diagnostics.ProcessStartInfo startInfo = new System.Diagnostics.ProcessStartInfo();
                startInfo.FileName = "myCommandLineApp.exe";
                startInfo.UseShellExecute = false;
                startInfo.RedirectStandardOutput = true;
                startInfo.CreateNoWindow = true;
                startInfo.WindowStyle = System.Diagnostics.ProcessWindowStyle.Hidden;

                using (var pProcess = new System.Diagnostics.Process())
                {
                    pProcess.StartInfo = startInfo;
                    pProcess.OutputDataReceived += new System.Diagnostics.DataReceivedEventHandler(RunExternalProcess.Process_OutputDataReceived);
                    pProcess.EnableRaisingEvents = true;
                    try
                    {
                        pProcess.Start();
                        pProcess.BeginOutputReadLine();
                        pProcess.BeginErrorReadLine();
                        pProcess.WaitForExit();
                    }
                    catch (Exception ex)
                    {
                        MessageBox.Show(ex.ToString());
                    }
                    finally
                    {
                        pProcess.OutputDataReceived -= new System.Diagnostics.DataReceivedEventHandler(RunExternalProcess.Process_OutputDataReceived);
                    }
                }
            }

            private static void Process_OutputDataReceived(object sender, System.Diagnostics.DataReceivedEventArgs e)
            {
                if (!string.IsNullOrEmpty(e.Data))
                {
                    RunExternalProcess.OutputMessage(e.Data);
                }
            }

            private static void OutputMessage(string message)
            {
                RunExternalProcess.stringWriter.WriteLine("[" + RunExternalProcess.numberOutputLines++.ToString() + "] - " + message);
            }

            private void stringWriter_StringWritten(object sender, EventArgs e)
            {
                System.Diagnostics.Debug.WriteLine(((MyStringWriter)sender).GetStringBuilder().ToString());
                SetProgressTextBox(((MyStringWriter)sender).GetStringBuilder().ToString());
            }

            private delegate void SetProgressTextBoxCallback(string text);

            private void SetProgressTextBox(string text)
            {
                if (this.ProgressTextBox.InvokeRequired)
                {
                    SetProgressTextBoxCallback callback = new SetProgressTextBoxCallback(SetProgressTextBox);
                    this.BeginInvoke(callback, new object[] { text });
                }
                else
                {
                    this.ProgressTextBox.Text = text;
                    this.ProgressTextBox.Select(this.ProgressTextBox.Text.Length, 0);
                    this.ProgressTextBox.ScrollToCaret();
                }
            }
        }

        public class MyStringWriter : System.IO.StringWriter
        {
            // Define the event.
            public event EventHandler StringWritten;

            public MyStringWriter() : base() { }
            public MyStringWriter(StringBuilder sb) : base(sb) { }
            public MyStringWriter(StringBuilder sb, IFormatProvider formatProvider) : base(sb, formatProvider) { }
            public MyStringWriter(IFormatProvider formatProvider) : base(formatProvider) { }

            protected virtual void OnStringWritten()
            {
                if (StringWritten != null)
                {
                    StringWritten(this, EventArgs.Empty);
                }
            }

            public override void Write(char value)
            {
                base.Write(value);
                this.OnStringWritten();
            }

            public override void Write(char[] buffer, int index, int count)
            {
                base.Write(buffer, index, count);
                this.OnStringWritten();
            }

            public override void Write(string value)
            {
                base.Write(value);
                this.OnStringWritten();
            }
        }
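
    A hedged reading of the symptom, with a sketch (the method below is invented, not the poster's code): WaitForExit() is called on the UI thread inside the form's constructor, so the message pump is blocked and every BeginInvoke queued by SetProgressTextBox can only execute after the child process exits. Removing the blocking wait and letting the Exited event signal completion keeps the pump running:

        // Candidate replacement for the process-launching block in the
        // RunExternalProcess constructor; field and handler names reuse the
        // poster's, everything else is an assumption.
        private void StartProcessWithoutBlocking(System.Diagnostics.ProcessStartInfo startInfo)
        {
            var pProcess = new System.Diagnostics.Process();   // disposed in Exited below
            pProcess.StartInfo = startInfo;
            pProcess.EnableRaisingEvents = true;               // required for Exited to fire
            pProcess.OutputDataReceived += RunExternalProcess.Process_OutputDataReceived;
            pProcess.Exited += (s, e) =>                       // raised on a thread-pool thread
            {
                pProcess.OutputDataReceived -= RunExternalProcess.Process_OutputDataReceived;
                pProcess.Dispose();
            };
            pProcess.Start();
            pProcess.BeginOutputReadLine();
            // add startInfo.RedirectStandardError = true plus BeginErrorReadLine()
            // if stderr is needed as well.
            // No WaitForExit() here: the form stays responsive and each line
            // reaches ProgressTextBox as it is written.
        }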

    Read the article

  • Ameristar Wins with Oracle GoldenGate’s Heterogeneous Real-Time Data Integration

    - by Irem Radzik
    Today we announced a press release about another successful project with Oracle GoldenGate, this time at Ameristar. Ameristar is a casino gaming company that needed a single data integration solution to connect multiple heterogeneous systems to its Teradata data warehouse. The project involves integration of Ameristar's promotional and gaming data from 14 data sources across its 7 casino hotel properties, in real time, into a central Teradata data warehouse. The source systems include the Aristocrat gaming and MGT promotional management platforms running on Microsoft SQL Server 2000 databases. As you may have noticed, there was no Oracle Database involved in this project, but Ameristar's IT leadership knew that GoldenGate's strong heterogeneous and real-time data integration capabilities made it the right technology for their data warehousing project. With GoldenGate, Ameristar was able to reduce data latency to the enterprise data warehouse, and marketing teams now use this real-time customer information to improve the overall customer experience. Ameristar customers receive more targeted and timely campaign offers, and the company has more up-to-date visibility into its financial metrics. One other key benefit the company experienced with GoldenGate is in operational costs. The previous data capture solution Ameristar used was trigger-based and required a lot of effort to manage; it needed dedicated IT staff to maintain. With GoldenGate, the solution runs seamlessly without needing fully dedicated staff, giving the IT team at Ameristar more resources for their other IT projects. If you want to learn more about GoldenGate and the latest features for Oracle Database and non-Oracle databases, please watch our on-demand webcast about Oracle GoldenGate 11g Release 2.

    Read the article

  • Oracle SOA Suite customer panel: Successful Application Integration & SOA Projects

    - by Simone Geib
    At the recent SOA Suite customer panel, Roger Brown from UNS Energy, Fabio Ravagni from Cencosud and Paras Jain from Cisco discussed their recent SOA Suite implementations, business drivers and challenges, architecture and lessons learned. Roger started by describing how UNS redesigned their internet portal to improve their customer experience and reduce manual steps in their business processes. Through the use of Oracle Service Bus, Oracle BPEL Process Manager and Oracle Business Activity Monitoring, they provided more self-service functionality, automated their business processes, and increased the use of their web site by 12.98% in number of visits and 33.58% in average visit duration. Next, Fabio described the challenges Cencosud faced through continuous expansion of their business, differing standards and levels of expertise, and large volumes of information. By introducing Oracle SOA Suite, Oracle Data Integrator and Oracle Enterprise Repository, and with the help of Oracle Consulting, they significantly simplified their integration model, reduced their maintenance effort and increased their integration governance. The implemented solution has so far more than 400 services in production and more than 20 ongoing projects that will make use of the new integration platform. Last, but not least, Paras discussed the challenges the WebEx division of Cisco faced with a highly manual service fulfillment process, multiple data sources, and the resulting large room for error and delay in customer time-to-service. Through a redesign of their order fulfillment process and the introduction of Oracle SOA Suite, they significantly improved their SLAs, eliminated duplicate orders, provided higher visibility into the order process, and aligned business and IT. For more information about Oracle OpenWorld SOA & BPM sessions, please see the Focus on SOA and BPM document.

    Read the article

  • Connecting Clinical and Administrative Processes: Oracle SOA Suite for Healthcare Integration

    - by Mala Ramakrishnan
    One of the biggest IT challenges facing today’s health care industry is the difficulty finding reliable, secure, and cost-effective ways to exchange information. Payers and providers need versatile platforms for enterprise-wide information sharing. Clinicians require accurate information to provide quality care to patients while administrators need integrated information for all facets of the business operation. Both sides of the organization must be able to access information from research and development systems, practice management systems, claims systems, financial systems, and many others. Externally, these organizations must share claims data, patient records, pharmaceutical data, lab reports, and diagnostic information among third party entities—all while complying with emerging standards for formatting, processing, and storing electronic health records (EHR). Service-oriented architecture (SOA) enables developers to integrate many types of software applications, databases and computing platforms within a particular health network as well as with community, state, and national health information exchanges. The Oracle SOA Suite for healthcare integration is designed to provide healthcare organizations with comprehensive integration capabilities within a unified middleware platform, as well as with healthcare libraries and templates for streamlining healthcare IT projects. It reduces the need for specialized skills and enforces an enterprise-wide view of critical healthcare data.  Here is a new white paper that details more about this offering: Oracle SOA Suite for Healthcare Integration

    Read the article

  • Hosted EBS 11i Integration Repository Temporarily Offline

    - by Steven Chan (Oracle Development)
    Most developers know that they can integrate their external applications with the E-Business Suite via the business service interfaces and SOA service endpoints documented in the E-Business Suite's Integration Repository. This is shipped as part of EBS 12. Until recently, it was provided as a hosted environment on the Oracle.com domain for EBS 11i. Unfortunately, we identified some standards-related issues in the process of switching from the existing server that hosts the EBS 11i environment to a new one, notably in the area of accessibility. Some of those issues will require coding changes to resolve. Given our focus on EBS 12.2 right now, it may take some time to prioritize this relative to our other existing commitments. In the meantime, we are required to suspend access to the EBS 11i Integration Repository. I don't have a firm schedule for getting this back online yet, but you're welcome to monitor or subscribe to this blog. I'll post updates here as soon as they're available. Related Articles: Integration Repository for the E-Business Suite; New Whitepaper: Primer on Integrating EBS 12 with Other Applications

    Read the article

  • DocumentBuilder: The process cannot access the file because it is being used by another process

    - by st mnmn
    I am using DocumentBuilder (of the Open XML API). For those who don't know DocumentBuilder, a short explanation: it has a function 'BuildDocument' which takes a list of sources (each source contains a WmlDocument) and a file name to save to:

        public static void BuildDocument(List<Source> sources, string fileName)

    The purpose of this function is to build one Word docx that contains all the sources; it merges several docs into one. At the end it saves the doc using File.WriteAllBytes(...). But when I run my project on the server I keep getting the error: "The process cannot access the file because it is being used by another process." A few times it works OK, and in Visual Studio it also works without errors. What can be the problem?
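
    A hedged workaround sketch rather than a diagnosis, assuming the lock is transient (another request, an antivirus scan, or a lingering handle on the previous output): writing each build to a unique file name sidesteps the contention, with a short retry as a fallback. The paths, the Source constructor arguments, and the retry numbers are illustrative assumptions.

        using System;
        using System.Collections.Generic;
        using System.IO;
        using System.Threading;
        using OpenXmlPowerTools;   // assumption: the namespace hosting DocumentBuilder

        class DocBuildSketch
        {
            static void Main()
            {
                // Placeholder inputs: replace with the real source documents.
                var sources = new List<Source>
                {
                    new Source(new WmlDocument(@"C:\temp\docs\part1.docx"), true)
                };

                // A unique output name per request, so concurrent server calls
                // never contend for the same .docx file.
                string fileName = Path.Combine(@"C:\temp\docs",
                                               Guid.NewGuid().ToString("N") + ".docx");

                for (int attempt = 0; ; attempt++)
                {
                    try
                    {
                        DocumentBuilder.BuildDocument(sources, fileName); // signature quoted in the question
                        break;
                    }
                    catch (IOException)
                    {
                        if (attempt >= 3) throw;  // still locked: give up and surface the error
                        Thread.Sleep(250);        // brief back-off for transient locks
                    }
                }
            }
        }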

    Read the article

  • Get Process ID of Program Started with C# Process.Start

    - by ThaKidd
    Thanks in advance for all of your help! I am currently developing a program in C# 2010 that launches PLink (PuTTY) to create an SSH tunnel. I am trying to make the program keep track of each tunnel that is open, so a user may terminate those instances that are no longer needed. I am currently using System.Diagnostics.Process.Start to run PLink (currently PuTTY is being used). I need to determine the PID of each PLink instance when it is launched so a user can terminate it at will. The question is how to do that, and whether I am using the right .NET namespace or there is something better. Code snippet:

        private void btnSSHTest_Click(object sender, EventArgs e)
        {
            String puttyConString;
            puttyConString = "-ssh -P " + cboSSHVersion.SelectedText +
                             " -" + txtSSHPort.Text +
                             " -pw " + txtSSHPassword.Text + " " +
                             txtSSHUsername.Text + "@" + txtSSHHostname.Text;
            Process.Start("C:\\Program Files (x86)\\Putty\\putty.exe", puttyConString);
        }
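
    A hedged sketch of one way to do the tracking (the TunnelManager class and its method names are invented for illustration): Process.Start returns the Process object for the new instance, and its Id property is the operating-system PID; holding on to those objects lets the UI kill a chosen tunnel later.

        using System.Collections.Generic;
        using System.Diagnostics;

        // Sketch: remember each launched tunnel by PID so it can be terminated on demand.
        public class TunnelManager
        {
            private readonly Dictionary<int, Process> tunnels = new Dictionary<int, Process>();

            public int StartTunnel(string exePath, string args)
            {
                Process proc = Process.Start(exePath, args);  // Process.Start returns the new Process
                tunnels[proc.Id] = proc;                      // proc.Id is the OS-assigned PID
                return proc.Id;
            }

            public void KillTunnel(int pid)
            {
                Process proc;
                if (tunnels.TryGetValue(pid, out proc))
                {
                    if (!proc.HasExited)
                        proc.Kill();                          // terminate that plink/putty instance
                    proc.Dispose();
                    tunnels.Remove(pid);
                }
            }
        }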

    Read the article

  • How to control (start/kill) a background process (server app) in Ruby

    - by rubiii
    Hey guys, I'm trying to set up a server for integration tests (specs, actually) via Ruby and can't figure out how to control the process. What I'm trying to do is: run a rake task for my gem that executes the integration specs; the task needs to first start a server (I use WEBrick) and then run the specs; after executing the specs it should kill the WEBrick so I'm not left with an unused background process. WEBrick is not a requirement, but it's included in the Ruby standard library, so being able to use it would be great. Hope anyone is able to help! PS: I'm running on Linux, so having this work on Windows is not my main priority (right now).

    Read the article

  • Background process in Linux using Ruby on Rails

    - by Salil
    Hi all, I want to run processes such as sending emails or ffmpeg commands in the background, as they take too much time; they should run as background jobs. I am using Fedora 10. Also, can I check whether my background process is running successfully or not? Is it possible? If yes, what steps should I follow? Any help is appreciated. Thanks in advance.

    Read the article

  • Business Rule and Process Management?

    - by elgcom
    After some searching in Google and Wikipedia, I still cannot get a clear picture of the "difference" between a BRMS (Business Rule Management System) and a BPM (Business Process Management)/workflow system. Can those two concepts do the same thing as each other (theoretically)? A "rule" can be modeled as a "process" as well, can't it?

    Read the article

  • How to automatically close a process that uses more than a specified amount of memory on Windows

    - by SINTER
    How do I automatically close a process that uses more than a specified amount of memory on Windows? Is it possible to specify some amount of memory (for example 1 MB) and run some executable file with that limit as a parameter? If the process tries to allocate more than that amount of memory, it should be closed and some error value returned. Is there an easy way to do something like that on Windows? Excuse my English.
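
    A hedged sketch of the simplest watchdog approach ("target.exe" and the 1 MB limit are placeholders): launch the executable, poll its memory counters, and kill it if it crosses the cap. For a kernel-enforced cap, a Windows Job Object with JOB_OBJECT_LIMIT_PROCESS_MEMORY is the stricter alternative.

        using System.Diagnostics;

        class MemoryWatchdog
        {
            static int Main()
            {
                const long limitBytes = 1 * 1024 * 1024;     // example cap: 1 MB
                var proc = Process.Start("target.exe");      // placeholder executable

                while (!proc.WaitForExit(100))               // poll every 100 ms until exit
                {
                    proc.Refresh();                          // update the cached counters
                    if (proc.PrivateMemorySize64 > limitBytes)
                    {
                        proc.Kill();                         // over the cap: terminate
                        return 1;                            // report an error value
                    }
                }
                return proc.ExitCode;                        // stayed under the cap
            }
        }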

    Read the article

  • Version Assemblies with TFS 2010 Continuous Integration

    - by Steve Michelotti
    When I first heard that TFS 2010 had moved to Workflow Foundation for Team Build, I was *extremely* skeptical. I've loved MSBuild and didn't quite understand the reasons for this change. In fact, given that I've been exclusively using CruiseControl for continuous integration (CI) for the last 5+ years of my career, I was skeptical of TFS for CI in general. However, after going through the learning process for TFS 2010 recently, I'm starting to become a believer. I'm also starting to see some of the benefits of Workflow Foundation for the overall process, because it gives you constructs not available in MSBuild such as parallel tasks, better control-flow constructs, and a slightly better customization story.

    The first customization I had to make to the build process was to version the assemblies of my solution. This is not new. In fact, I'd recommend reading Mike Fourie's well-known post on Versioning Code in TFS before you get started. That post describes several foundational aspects of versioning assemblies regardless of your version of TFS. The main points are: 1) don't use source control operations for your version file, 2) use a schema like <Major>.<Minor>.<IncrementalNumber>.0, and 3) do not keep AssemblyVersion and AssemblyFileVersion in sync.

    To do this in TFS 2010, the best post I've found has been Jim Lamb's post on building a custom TFS 2010 workflow activity. Overall that post is excellent, but the primary issue I have with it is that the assembly version numbers produced are based on a date and look like this: "2010.5.15.1". This is definitely not what I want. I want to be able to communicate to the developers and stakeholders that we are producing the "1.1 release" or "1.2 release" – which would have an assembly version number of "1.1.317.0", for example. In this post, I'll walk through the process of customizing the assembly version number based on this method – adapting the concepts in Lamb's post to suit my needs. I'll also be combining this with the concepts in Fourie's post – particularly with regard to the standards around how to version the assemblies.

    The first thing I'll do is add a file called SolutionAssemblyVersionInfo.cs to the root of my solution that looks like this:

        using System;
        using System.Reflection;
        [assembly: AssemblyVersion("1.1.0.0")]
        [assembly: AssemblyFileVersion("1.1.0.0")]

    I'll then add that file as a Visual Studio link file to each project in my solution by right-clicking the project, choosing "Add – Existing Item…" and then, when I click the SolutionAssemblyVersionInfo.cs file, making sure I "Add As Link". Now the Solution Explorer will show our file. We can see that it's a link file because of the black arrow in the icon within all our projects. Of course, you'll need to remove the AssemblyVersion and AssemblyFileVersion attributes from the AssemblyInfo.cs files to avoid duplicate attributes, since they now live in the SolutionAssemblyVersionInfo.cs file. This is an extremely common technique so that all the projects in our solution can be versioned as a unit.

    At this point, we're ready to write our custom activity. The primary consideration is that I want the developer and/or tech lead to be able to easily control the Major.Minor, and then I want the CI process to supply the third number with a unique incremental value. We'll always leave the fourth position "0" for now – it's held in reserve in case the day ever comes where we need to do an emergency patch to Production based on a branched version.

    Writing the Custom Workflow Activity

    Similar to Lamb's post, I'm going to write two custom workflow activities. The "outer" activity (a XAML activity) is pretty straightforward. It checks whether the solution version file exists in the solution root and, if so, delegates the replacement of the version to the AssemblyVersionInfo activity, which is a CodeActivity.

    Notice that the arguments of this activity are "solutionVersionFile" and "tfsBuildNumber", which will be passed in. The tfsBuildNumber will look something like "CI_MyApplication.4", and we need to grab the "4" (i.e., the incremental revision number) and put it in the third position. Then we need to honor whatever was specified for Major.Minor in the SolutionAssemblyVersionInfo.cs file. For example, if the SolutionAssemblyVersionInfo.cs file had "1.1.0.0" for the AssemblyVersion (as shown in the first code block near the beginning of this post), then we want the resulting file to have "1.1.4.0". Before we do anything, let's put together a unit test for all this so we can know if we get it right:

        [TestMethod]
        public void Assembly_version_should_be_parsed_correctly_from_build_name()
        {
            // arrange
            const string versionFile = "SolutionAssemblyVersionInfo.cs";
            WriteTestVersionFile(versionFile);
            var activity = new VersionAssemblies();
            var arguments = new Dictionary<string, object> {
                { "tfsBuildNumber", "CI_MyApplication.4" },
                { "solutionVersionFile", versionFile }
            };

            // act
            var result = WorkflowInvoker.Invoke(activity, arguments);

            // assert
            Assert.AreEqual("1.2.4.0", (string)result["newAssemblyFileVersion"]);
            var lines = File.ReadAllLines(versionFile);
            Assert.IsTrue(lines.Contains("[assembly: AssemblyVersion(\"1.2.0.0\")]"));
            Assert.IsTrue(lines.Contains("[assembly: AssemblyFileVersion(\"1.2.4.0\")]"));
        }

        private void WriteTestVersionFile(string versionFile)
        {
            var fileContents = "using System.Reflection;\n" +
                               "[assembly: AssemblyVersion(\"1.2.0.0\")]\n" +
                               "[assembly: AssemblyFileVersion(\"1.2.0.0\")]";
            File.WriteAllText(versionFile, fileContents);
        }

    At this point, the code for our AssemblyVersionInfo activity is pretty straightforward:

        [BuildActivity(HostEnvironmentOption.Agent)]
        public class AssemblyVersionInfo : CodeActivity
        {
            [RequiredArgument]
            public InArgument<string> FileName { get; set; }

            [RequiredArgument]
            public InArgument<string> TfsBuildNumber { get; set; }

            public OutArgument<string> NewAssemblyFileVersion { get; set; }

            protected override void Execute(CodeActivityContext context)
            {
                var solutionVersionFile = this.FileName.Get(context);

                // Ensure that the file is writeable
                var fileAttributes = File.GetAttributes(solutionVersionFile);
                File.SetAttributes(solutionVersionFile, fileAttributes & ~FileAttributes.ReadOnly);

                // Prepare assembly versions
                var majorMinor = GetAssemblyMajorMinorVersionBasedOnExisting(solutionVersionFile);
                var newBuildNumber = GetNewBuildNumber(this.TfsBuildNumber.Get(context));
                var newAssemblyVersion = string.Format("{0}.{1}.0.0",
                    majorMinor.Item1, majorMinor.Item2);
                var newAssemblyFileVersion = string.Format("{0}.{1}.{2}.0",
                    majorMinor.Item1, majorMinor.Item2, newBuildNumber);
                this.NewAssemblyFileVersion.Set(context, newAssemblyFileVersion);

                // Perform the actual replacement
                var contents = this.GetFileContents(newAssemblyVersion, newAssemblyFileVersion);
                File.WriteAllText(solutionVersionFile, contents);

                // Restore the file's original attributes
                File.SetAttributes(solutionVersionFile, fileAttributes);
            }

            #region Private Methods

            private string GetFileContents(string newAssemblyVersion, string newAssemblyFileVersion)
            {
                var cs = new StringBuilder();
                cs.AppendLine("using System.Reflection;");
                cs.AppendFormat("[assembly: AssemblyVersion(\"{0}\")]", newAssemblyVersion);
                cs.AppendLine();
                cs.AppendFormat("[assembly: AssemblyFileVersion(\"{0}\")]", newAssemblyFileVersion);
                return cs.ToString();
            }

            private Tuple<string, string> GetAssemblyMajorMinorVersionBasedOnExisting(string filePath)
            {
                var lines = File.ReadAllLines(filePath);
                var versionLine = lines.Where(x => x.Contains("AssemblyVersion")).FirstOrDefault();

                if (versionLine == null)
                {
                    throw new InvalidOperationException("File does not contain [assembly: AssemblyVersion] attribute");
                }

                return ExtractMajorMinor(versionLine);
            }

            private static Tuple<string, string> ExtractMajorMinor(string versionLine)
            {
                var firstQuote = versionLine.IndexOf('"') + 1;
                var secondQuote = versionLine.IndexOf('"', firstQuote);
                var version = versionLine.Substring(firstQuote, secondQuote - firstQuote);
                var versionParts = version.Split('.');
                return new Tuple<string, string>(versionParts[0], versionParts[1]);
            }

            private string GetNewBuildNumber(string buildName)
            {
                return buildName.Substring(buildName.LastIndexOf(".") + 1);
            }

            #endregion
        }

    The final step is to incorporate this activity into the overall build template. Make a copy of the DefaultTemplate.xaml – we'll call it DefaultTemplateWithVersioning.xaml. Before the build and labeling happens, drag the VersionAssemblies activity in. Then set the LabelName variable to BuildDetail.BuildDefinition.Name + "-" + newAssemblyFileVersion, since the newAssemblyFileVersion was produced by our activity.

    Configuring CI

    Once you add your solution to source control, you can configure CI in the build definition window. The main difference is that we change the Process tab to reflect a different build number format and choose our custom build process file. When the build completes, we see the name of our project with the unique revision number. If we look at the detailed build log for the latest build, we see the label being created by our custom task. We can now look at the label history in TFS and see the project name with the labels (the Assignment activity I added to the workflow). Finally, if we look at the physical assemblies that are produced, we can right-click on any assembly in Windows Explorer and see the assembly version in its properties.

    Full Traceability

    We now have full traceability for our code. There will never be a question of what code was deployed to Production. You can always see the assembly version in the properties of the physical assembly. That version can be traced back to a label in TFS whose unique revision number matches. The label in TFS gives you the complete snapshot of the code in your source control repository at the time the code was built. This type of process for full traceability has been used for many years in CI – in fact, I've done similar things with CCNet and SVN for quite some time. This is simply the TFS implementation of that pattern. The new features in TFS 2010 make these types of build process customizations quite easy once you get over the initial learning curve.

    Read the article
