Search Results

Search found 5214 results on 209 pages for 'j unit 122'.

Page 1/209 | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • A new name for unit tests

    - by Will
    I never used to like unit testing. I always thought it increased the amount of work I had to do. Turns out, that's only true in terms of the actual number of lines of code you write, and furthermore, this is completely offset by the increase in the number of lines of useful code that you can write in an hour with tests and test driven development. Now I love unit tests, as they allow me to write useful code that quite often works the first time! (knock on wood) I have found that people are reluctant to do unit tests or start a project with test driven development if they are under strict timelines or in an environment where others don't do it, so they don't. Kinda like a cultural refusal to even try. I think one of the most powerful things about unit testing is the confidence that it gives you to undertake refactoring. It also gives newfound hope that I can give my code to someone else to refactor/improve, and if my unit tests still work, I can use the new version of the library that they modified, pretty much, without fear. It's this last aspect of unit testing that I think needs a new name. The unit test is more like a contract of what this code should do now, and in the future. When I hear the word testing, I think of mice in cages, with multiple experiments done on them to see the effectiveness of a compound. This is not what unit testing is: we're not trying out different code to see which is the most effective approach, we're defining what outputs we expect for given inputs. In the mice example, unit tests are more like the definitions of how the universe will work, as opposed to the experiments done on the mice. Am I on crack, or does anyone else see this refusal to do testing, and do you think people avoid it for similar reasons? What reasons do you / others give for not testing? What do you think their motivations are in not unit testing? And as a new name for unit testing that might get over some of the objections, how about jContract? (A bit Java-centric, I know :), or Unit Contracts?

    Read the article

  • Is 192.168.122.1 a valid IP?

    - by Louise Hoffman
    From my understanding, the private networks are as follows: Class A: 10.0.0.1 - 10.255.255.254 Class B: 172.16.0.1 - 172.16.255.254 Class C: 192.168.0.1 - 192.168.0.254 But then I look at ifconfig virbr0 on my Linux computer: virbr0 Link encap:Ethernet HWaddr 42:40:99:CB:02:7F inet addr:192.168.122.1 Bcast:192.168.122.255 Mask:255.255.255.0 UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 RX packets:0 errors:0 dropped:0 overruns:0 frame:0 TX packets:16 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:0 RX bytes:0 (0.0 b) TX bytes:2842 (2.7 KiB) Here the IP address is 192.168.122.1. Is that an allowed IP? And if so, is 192.168 then actually a Class B network?
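    For context: RFC 1918 reserves 10.0.0.0/8, 172.16.0.0/12 and 192.168.0.0/16 for private use, which is broader than the Class B and C ranges quoted above, so 192.168.122.1 does fall inside a private range. A minimal C# sketch of checking an address against those three blocks; the class and method names are just for illustration:

    using System;
    using System.Net;

    class PrivateIpCheck
    {
        // True when the IPv4 address lies in one of the RFC 1918 private ranges.
        static bool IsRfc1918(IPAddress address)
        {
            byte[] b = address.GetAddressBytes();
            if (b.Length != 4) return false;                            // IPv4 only
            if (b[0] == 10) return true;                                // 10.0.0.0/8
            if (b[0] == 172 && b[1] >= 16 && b[1] <= 31) return true;   // 172.16.0.0/12
            if (b[0] == 192 && b[1] == 168) return true;                // 192.168.0.0/16
            return false;
        }

        static void Main()
        {
            Console.WriteLine(IsRfc1918(IPAddress.Parse("192.168.122.1"))); // True
        }
    }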

    Read the article

  • Integrating JavaScript Unit Tests with Visual Studio

    - by Stephen Walther
    Modern ASP.NET web applications take full advantage of client-side JavaScript to provide better interactivity and responsiveness. If you are building an ASP.NET application in the right way, you quickly end up with lots and lots of JavaScript code. When writing server code, you should be writing unit tests. One big advantage of unit tests is that they provide you with a safety net that enable you to safely modify your existing code – for example, fix bugs, add new features, and make performance enhancements -- without breaking your existing code. Every time you modify your code, you can execute your unit tests to verify that you have not broken anything. For the same reason that you should write unit tests for your server code, you should write unit tests for your client code. JavaScript is just as susceptible to bugs as C#. There is no shortage of unit testing frameworks for JavaScript. Each of the major JavaScript libraries has its own unit testing framework. For example, jQuery has QUnit, Prototype has UnitTestJS, YUI has YUI Test, and Dojo has Dojo Objective Harness (DOH). The challenge is integrating a JavaScript unit testing framework with Visual Studio. Visual Studio and Visual Studio ALM provide fantastic support for server-side unit tests. You can easily view the results of running your unit tests in the Visual Studio Test Results window. You can set up a check-in policy which requires that all unit tests pass before your source code can be committed to the source code repository. In addition, you can set up Team Build to execute your unit tests automatically. Unfortunately, Visual Studio does not provide “out-of-the-box” support for JavaScript unit tests. MS Test, the unit testing framework included in Visual Studio, does not support JavaScript unit tests. As soon as you leave the server world, you are left on your own. The goal of this blog entry is to describe one approach to integrating JavaScript unit tests with MS Test so that you can execute your JavaScript unit tests side-by-side with your C# unit tests. The goal is to enable you to execute JavaScript unit tests in exactly the same way as server-side unit tests. You can download the source code described by this project by scrolling to the end of this blog entry. Rejected Approach: Browser Launchers One popular approach to executing JavaScript unit tests is to use a browser as a test-driver. When you use a browser as a test-driver, you open up a browser window to execute and view the results of executing your JavaScript unit tests. For example, QUnit – the unit testing framework for jQuery – takes this approach. The following HTML page illustrates how you can use QUnit to create a unit test for a function named addNumbers(). 
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd"> <html> <head> <title>Using QUnit</title> <link rel="stylesheet" href="http://github.com/jquery/qunit/raw/master/qunit/qunit.css" type="text/css" /> </head> <body> <h1 id="qunit-header">QUnit example</h1> <h2 id="qunit-banner"></h2> <div id="qunit-testrunner-toolbar"></div> <h2 id="qunit-userAgent"></h2> <ol id="qunit-tests"></ol> <div id="qunit-fixture">test markup, will be hidden</div> <script type="text/javascript" src="http://code.jquery.com/jquery-latest.js"></script> <script type="text/javascript" src="http://github.com/jquery/qunit/raw/master/qunit/qunit.js"></script> <script type="text/javascript"> // The function to test function addNumbers(a, b) { return a+b; } // The unit test test("Test of addNumbers", function () { equals(4, addNumbers(1,3), "1+3 should be 4"); }); </script> </body> </html> This test verifies that calling addNumbers(1,3) returns the expected value 4. When you open this page in a browser, you can see that this test does, in fact, pass. The idea is that you can quickly refresh this QUnit HTML JavaScript test driver page in your browser whenever you modify your JavaScript code. In other words, you can keep a browser window open and keep refreshing it over and over while you are developing your application. That way, you can know very quickly whenever you have broken your JavaScript code. While easy to setup, there are several big disadvantages to this approach to executing JavaScript unit tests: You must view your JavaScript unit test results in a different location than your server unit test results. The JavaScript unit test results appear in the browser and the server unit test results appear in the Visual Studio Test Results window. Because all of your unit test results don’t appear in a single location, you are more likely to introduce bugs into your code without noticing it. Because your unit tests are not integrated with Visual Studio – in particular, MS Test -- you cannot easily include your JavaScript unit tests when setting up check-in policies or when performing automated builds with Team Build. A more sophisticated approach to using a browser as a test-driver is to automate the web browser. Instead of launching the browser and loading the test code yourself, you use a framework to automate this process. There are several different testing frameworks that support this approach: · Selenium – Selenium is a very powerful framework for automating browser tests. You can create your tests by recording a Firefox session or by writing the test driver code in server code such as C#. You can learn more about Selenium at http://seleniumhq.org/. LTAF – The ASP.NET team uses the Lightweight Test Automation Framework to test JavaScript code in the ASP.NET framework. You can learn more about LTAF by visiting the project home at CodePlex: http://aspnet.codeplex.com/releases/view/35501 jsTestDriver – This framework uses Java to automate the browser. jsTestDriver creates a server which can be used to automate multiple browsers simultaneously. This project is located at http://code.google.com/p/js-test-driver/ TestSwam – This framework, created by John Resig, uses PHP to automate the browser. Like jsTestDriver, the framework creates a test server. You can open multiple browsers that are automated by the test server. 
Learn more about TestSwarm by visiting the following address: https://github.com/jeresig/testswarm/wiki Yeti – This is the framework introduced by Yahoo for automating browser tests. Yeti uses server-side JavaScript and depends on Node.js. Learn more about Yeti at http://www.yuiblog.com/blog/2010/08/25/introducing-yeti-the-yui-easy-testing-interface/ All of these frameworks are great for integration tests – however, they are not the best frameworks to use for unit tests. In one way or another, all of these frameworks depend on executing tests within the context of a “living and breathing” browser. If you create an ASP.NET Unit Test then Visual Studio will launch a web server before executing the unit test. Why is launching a web server so bad? It is not the worst thing in the world. However, it does introduce dependencies that prevent your code from being tested in isolation. One of the defining features of a unit test -- versus an integration test – is that a unit test tests code in isolation. Another problem with launching a web server when performing unit tests is that launching a web server can be slow. If you cannot execute your unit tests quickly, you are less likely to execute your unit tests each and every time you make a code change. You are much more likely to fall into the pit of failure. Launching a browser when performing a JavaScript unit test has all of the same disadvantages as launching a web server when performing an ASP.NET unit test. Instead of testing a unit of JavaScript code in isolation, you are testing JavaScript code within the context of a particular browser. Using the frameworks listed above for integration tests makes perfect sense. However, I want to consider a different approach for creating unit tests for JavaScript code. Using Server-Side JavaScript for JavaScript Unit Tests A completely different approach to executing JavaScript unit tests is to perform the tests outside of any browser. If you really want to test JavaScript then you should test JavaScript and leave the browser out of the testing process. There are several ways that you can execute JavaScript on the server outside the context of any browser: Rhino – Rhino is an implementation of JavaScript written in Java. The Rhino project is maintained by the Mozilla project. Learn more about Rhino at http://www.mozilla.org/rhino/ V8 – V8 is the open-source Google JavaScript engine written in C++. This is the JavaScript engine used by the Chrome web browser. You can download V8 and embed it in your project by visiting http://code.google.com/p/v8/ JScript – JScript is the JavaScript Script Engine used by Internet Explorer (up to but not including Internet Explorer 9), Windows Script Host, and Active Server Pages. Internet Explorer is still the most popular web browser. Therefore, I decided to focus on using the JScript Script Engine to execute JavaScript unit tests. Using the Microsoft Script Control There are two basic ways that you can pass JavaScript to the JScript Script Engine and execute the code: use the Microsoft Windows Script Interfaces or use the Microsoft Script Control. The difficult and proper way to execute JavaScript using the JScript Script Engine is to use the Microsoft Windows Script Interfaces. You can learn more about the Script Interfaces by visiting http://msdn.microsoft.com/en-us/library/t9d4xf28(VS.85).aspx The main disadvantage of using the Script Interfaces is that they are difficult to use from .NET. 
There is a great series of articles on using the Script Interfaces from C# located at http://www.drdobbs.com/184406028. I picked the easier alternative and used the Microsoft Script Control. The Microsoft Script Control is an ActiveX control that provides a higher level abstraction over the Window Script Interfaces. You can download the Microsoft Script Control from here: http://www.microsoft.com/downloads/en/details.aspx?FamilyID=d7e31492-2595-49e6-8c02-1426fec693ac After you download the Microsoft Script Control, you need to add a reference to it to your project. Select the Visual Studio menu option Project, Add Reference to open the Add Reference dialog. Select the COM tab and add the Microsoft Script Control 1.0. Using the Script Control is easy. You call the Script Control AddCode() method to add JavaScript code to the Script Engine. Next, you call the Script Control Run() method to run a particular JavaScript function. The reference documentation for the Microsoft Script Control is located at the MSDN website: http://msdn.microsoft.com/en-us/library/aa227633%28v=vs.60%29.aspx Creating the JavaScript Code to Test To keep things simple, let’s imagine that you want to test the following JavaScript function named addNumbers() which simply adds two numbers together: MvcApplication1\Scripts\Math.js function addNumbers(a, b) { return 5; } Notice that the addNumbers() method always returns the value 5. Right-now, it will not pass a good unit test. Create this file and save it in your project with the name Math.js in your MVC project’s Scripts folder (Save the file in your actual MVC application and not your MVC test application). Creating the JavaScript Test Helper Class To make it easier to use the Microsoft Script Control in unit tests, we can create a helper class. This class contains two methods: LoadFile() – Loads a JavaScript file. Use this method to load the JavaScript file being tested or the JavaScript file containing the unit tests. ExecuteTest() – Executes the JavaScript code. Use this method to execute a JavaScript unit test. Here’s the code for the JavaScriptTestHelper class: JavaScriptTestHelper.cs   using System; using System.IO; using Microsoft.VisualStudio.TestTools.UnitTesting; using MSScriptControl; namespace MvcApplication1.Tests { public class JavaScriptTestHelper : IDisposable { private ScriptControl _sc; private TestContext _context; /// <summary> /// You need to use this helper with Unit Tests and not /// Basic Unit Tests because you need a Test Context /// </summary> /// <param name="testContext">Unit Test Test Context</param> public JavaScriptTestHelper(TestContext testContext) { if (testContext == null) { throw new ArgumentNullException("TestContext"); } _context = testContext; _sc = new ScriptControl(); _sc.Language = "JScript"; _sc.AllowUI = false; } /// <summary> /// Load the contents of a JavaScript file into the /// Script Engine. /// </summary> /// <param name="path">Path to JavaScript file</param> public void LoadFile(string path) { var fileContents = File.ReadAllText(path); _sc.AddCode(fileContents); } /// <summary> /// Pass the path of the test that you want to execute. 
/// </summary> /// <param name="testMethodName">JavaScript function name</param> public void ExecuteTest(string testMethodName) { dynamic result = null; try { result = _sc.Run(testMethodName, new object[] { }); } catch { var error = ((IScriptControl)_sc).Error; if (error != null) { var description = error.Description; var line = error.Line; var column = error.Column; var text = error.Text; var source = error.Source; if (_context != null) { var details = String.Format("{0} \r\nLine: {1} Column: {2}", source, line, column); _context.WriteLine(details); } } throw new AssertFailedException(error.Description); } } public void Dispose() { _sc = null; } } }     Notice that the JavaScriptTestHelper class requires a Test Context to be instantiated. For this reason, you can use the JavaScriptTestHelper only with a Visual Studio Unit Test and not a Basic Unit Test (These are two different types of Visual Studio project items). Add the JavaScriptTestHelper file to your MVC test application (for example, MvcApplication1.Tests). Creating the JavaScript Unit Test Next, we need to create the JavaScript unit test function that we will use to test the addNumbers() function. Create a folder in your MVC test project named JavaScriptTests and add the following JavaScript file to this folder: MvcApplication1.Tests\JavaScriptTests\MathTest.js /// <reference path="JavaScriptUnitTestFramework.js"/> function testAddNumbers() { // Act var result = addNumbers(1, 3); // Assert assert.areEqual(4, result, "addNumbers did not return right value!"); }   The testAddNumbers() function takes advantage of another JavaScript library named JavaScriptUnitTestFramework.js. This library contains all of the code necessary to make assertions. Add the following JavaScriptnitTestFramework.js to the same folder as the MathTest.js file: MvcApplication1.Tests\JavaScriptTests\JavaScriptUnitTestFramework.js var assert = { areEqual: function (expected, actual, message) { if (expected !== actual) { throw new Error("Expected value " + expected + " is not equal to " + actual + ". " + message); } } }; There is only one type of assertion supported by this file: the areEqual() assertion. Most likely, you would want to add additional types of assertions to this file to make it easier to write your JavaScript unit tests. Deploying the JavaScript Test Files This step is non-intuitive. When you use Visual Studio to run unit tests, Visual Studio creates a new folder and executes a copy of the files in your project. After you run your unit tests, your Visual Studio Solution will contain a new folder named TestResults that includes a subfolder for each test run. You need to configure Visual Studio to deploy your JavaScript files to the test run folder or Visual Studio won’t be able to find your JavaScript files when you execute your unit tests. You will get an error that looks something like this when you attempt to execute your unit tests: You can configure Visual Studio to deploy your JavaScript files by adding a Test Settings file to your Visual Studio Solution. It is important to understand that you need to add this file to your Visual Studio Solution and not a particular Visual Studio project. Right-click your Solution in the Solution Explorer window and select the menu option Add, New Item. Select the Test Settings item and click the Add button. After you create a Test Settings file for your solution, you can indicate that you want a particular folder to be deployed whenever you perform a test run. 
Select the menu option Test, Edit Test Settings to edit your test configuration file. Select the Deployment tab and select your MVC test project’s JavaScriptTest folder to deploy. Click the Apply button and the Close button to save the changes and close the dialog. Creating the Visual Studio Unit Test The very last step is to create the Visual Studio unit test (the MS Test unit test). Add a new unit test to your MVC test project by selecting the menu option Add New Item and selecting the Unit Test project item (Do not select the Basic Unit Test project item): The difference between a Basic Unit Test and a Unit Test is that a Unit Test includes a Test Context. We need this Test Context to use the JavaScriptTestHelper class that we created earlier. Enter the following test method for the new unit test: [TestMethod] public void TestAddNumbers() { var jsHelper = new JavaScriptTestHelper(this.TestContext); // Load JavaScript files jsHelper.LoadFile("JavaScriptUnitTestFramework.js"); jsHelper.LoadFile(@"..\..\..\MvcApplication1\Scripts\Math.js"); jsHelper.LoadFile("MathTest.js"); // Execute JavaScript Test jsHelper.ExecuteTest("testAddNumbers"); } This code uses the JavaScriptTestHelper to load three files: JavaScripUnitTestFramework.js – Contains the assert functions. Math.js – Contains the addNumbers() function from your MVC application which is being tested. MathTest.js – Contains the JavaScript unit test function. Next, the test method calls the JavaScriptTestHelper ExecuteTest() method to execute the testAddNumbers() JavaScript function. Running the Visual Studio JavaScript Unit Test After you complete all of the steps described above, you can execute the JavaScript unit test just like any other unit test. You can use the keyboard combination CTRL-R, CTRL-A to run all of the tests in the current Visual Studio Solution. Alternatively, you can use the buttons in the Visual Studio toolbar to run the tests: (Unfortunately, the Run All Impacted Tests button won’t work correctly because Visual Studio won’t detect that your JavaScript code has changed. Therefore, you should use either the Run Tests in Current Context or Run All Tests in Solution options instead.) The results of running the JavaScript tests appear side-by-side with the results of running the server tests in the Test Results window. For example, if you Run All Tests in Solution then you will get the following results: Notice that the TestAddNumbers() JavaScript test has failed. That is good because our addNumbers() function is hard-coded to always return the value 5. If you double-click the failing JavaScript test, you can view additional details such as the JavaScript error message and the line number of the JavaScript code that failed: Summary The goal of this blog entry was to explain an approach to creating JavaScript unit tests that can be easily integrated with Visual Studio and Visual Studio ALM. I described how you can use the Microsoft Script Control to execute JavaScript on the server. By taking advantage of the Microsoft Script Control, we were able to execute our JavaScript unit tests side-by-side with all of our other unit tests and view the results in the standard Visual Studio Test Results window. 
You can download the code discussed in this blog entry from here: http://StephenWalther.com/downloads/Blog/JavaScriptUnitTesting/JavaScriptUnitTests.zip Before running this code, you need to first install the Microsoft Script Control which you can download from here: http://www.microsoft.com/downloads/en/details.aspx?FamilyID=d7e31492-2595-49e6-8c02-1426fec693ac

    Read the article

  • Unit testing newbie team needs to unit test

    - by Walter
    I'm working with a new team that has historically not done ANY unit testing. My goal is for the team to eventually employ TDD (Test Driven Development) as their natural process. But since TDD is such a radical mind shift for a non-unit testing team I thought I would just start off with writing unit tests after coding. Has anyone been in a similar situation? What's an effective way to get a team to be comfortable with TDD when they've not done any unit testing? Does it make sense to do this in a couple of steps? Or should we dive right in and face all the growing pains at once?? EDIT Just for clarification, there is no one on the team (other than myself) who has ANY unit testing exposure/experience. And we are planning on using the unit testing functionality built into Visual Studio.
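    Since the team plans to use the unit testing functionality built into Visual Studio, here is a minimal sketch of what a first MSTest test might look like; the Calculator class is a hypothetical stand-in, not something from the poster's codebase:

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    // Hypothetical class under test.
    public class Calculator
    {
        public int Add(int a, int b) { return a + b; }
    }

    [TestClass]
    public class CalculatorTests
    {
        [TestMethod]
        public void Add_ReturnsSumOfBothOperands()
        {
            var calculator = new Calculator();
            Assert.AreEqual(5, calculator.Add(2, 3));
        }
    }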

    Read the article

  • Unit testing best practices for a unit testing newbie

    - by wilhil
    In recent years, I have only written small components for people in larger projects or small tools. I have never written a unit test, and it always seems like learning how to write them and actually making one takes a lot longer than simply firing up the program and testing for real. I am just about to start a fairly large-scale project that could take a few months to complete, and whilst I will try to test elements as I write them (like always), I am wondering if unit testing could save me time. I was just wondering if anyone could give good advice: Should I be looking at unit testing at the start of the project and possibly adopt a TDD approach? Should I just write tests as I go along, after each section is complete? Or should I complete the project and then write unit tests at the end?

    Read the article

  • Drivers and firmware for the LiteOn ihap-122-9 DVD drive

    - by Sandy
    I'm trying to replace the DVD drive in my old PC. LiteOn.com is a mess and I can't find a single working driver or firmware update there, or anywhere. Windows XP tries to use a default, generic driver dated 2001. (About 9 years before this drive even existed.) http://www.firmwarehq.com/download_1..._6L0H.EXE.html This correctly finds my LiteOn ihap-122-9 DVD drive. It correctly finds that I'm currently using firmware 6L0F. It correctly tries to install 6L0H. It completes 100% but then just fails and says "contact your vendor". Does anyone know why? Where can I actually get drivers... and firmware updates... that actually work for the ihap-122-9? Apparently, the newest driver IS the one made 9 years before the drive existed. (Unbelievable.) And the latest firmware is the one that is already in the drive. (Common.) No other drive I've had in this computer ever had a problem. This brand new LiteOn is doing this: Opening MY COMPUTER now takes 60 seconds. MY COMPUTER marking the drive as "DVD F:" takes another 30 seconds. MY COMPUTER showing the "Batman II" title takes another 15 seconds. Clicking and running the movie will take another 30 seconds for the main menu to appear. The movie starts about 20 seconds later. The movie runs fine for 1-2 seconds... then stops for 5 seconds... then starts again and plays for 1-2 seconds. This repeats for 2 hours. (It happens with all store-bought DVDs and all home-made DVDs.)

    Read the article

  • Unit testing code paths

    - by Michael
    When unit testing using expectations, you define a set of method calls and corresponding results for those calls. These define the path through the method that you want to test. I have read that unit tests should not duplicate the code. But when you define these expectations, isn't that duplicating the code, or at least the process? How do you know when you're duplicating functionality under test?
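    The question doesn't name a mocking framework, so purely as an illustration of what "defining expectations" usually looks like, here is a minimal C# sketch using Moq, with hypothetical IOrderRepository and OrderService types:

    using System.Linq;
    using Microsoft.VisualStudio.TestTools.UnitTesting;
    using Moq;

    // Hypothetical collaborator and class under test.
    public interface IOrderRepository
    {
        decimal[] GetPrices(int orderId);
    }

    public class OrderService
    {
        private readonly IOrderRepository _repository;
        public OrderService(IOrderRepository repository) { _repository = repository; }
        public decimal GetTotal(int orderId) { return _repository.GetPrices(orderId).Sum(); }
    }

    [TestClass]
    public class OrderServiceTests
    {
        [TestMethod]
        public void GetTotal_SumsThePricesReturnedByTheRepository()
        {
            // The expectation: when asked for order 42, return these prices.
            var repository = new Mock<IOrderRepository>();
            repository.Setup(r => r.GetPrices(42)).Returns(new[] { 10m, 15m });

            var service = new OrderService(repository.Object);

            // The assertion states the expected output for the given input.
            Assert.AreEqual(25m, service.GetTotal(42));
            repository.Verify(r => r.GetPrices(42), Times.Once());
        }
    }

    The test restates the contract - which collaborator call is expected and what output follows from a given input - rather than the arithmetic inside GetTotal, which is exactly the line the question is probing.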

    Read the article

  • What are the tangible advantages of proper Unit Tests over Functional Tests called unit tests?

    - by Jackie
    A project I am working on has a bunch of legacy tests that were not properly mocked out. Because of this, the only dependency it has is EasyMock, which doesn't support statics, constructors with arguments, etc. The tests instead rely on database connections and the like in order to "run". Adding PowerMock to handle these cases is being shot down as cost-prohibitive due to the need to upgrade the existing project to support it (another discussion). My questions are: what are the real-world, tangible benefits of proper unit testing that I can use to push back? Are there any? Am I just being a stickler by saying that bad unit tests (even if they work) are bad? Is code coverage just as effective?

    Read the article

  • Colleague unwilling to use unit tests "as it's more to code"

    - by m.edmondson
    A colleague is unwilling to use unit tests, instead opting to do a quick test, pass it to the users, and, if all is well, publish it live. Needless to say, some bugs do get through. I mentioned we should think about using unit tests - but she was completely against it once it was realised that more code would have to be written. This leaves me in the position of modifying something and not being sure the output is the same, especially as her code is spaghetti and I try to refactor it when I get a chance. So what's the best way forward for me?

    Read the article

  • unit level testing, agile, and refactoring

    - by dsollen
    I'm working on a very agile development system with a small number of people, with me doing the vast majority of the programming myself. I've gotten to the testing phase and find myself writing mostly functional-level tests, which I should in theory be leaving to our tester (in practice I don't entirely... trust our tester to detect and identify defects enough to leave him the sole writer of functional tests). In theory what I should be writing is unit-level tests. However, I'm not sure it's worth the expense. Unit testing takes some time to do, more than functional testing, since I have to set up mocks and plugs into smaller units that weren't designed to run in isolation. More importantly, I find I refactor and redesign heavily - part of this is due to my inheriting code that needed heavy redesign and is still being cleaned up, but even once I've finished removing the parts that need work, I'm sure that in the act of expanding the code I'll still do a decent amount of refactoring and redesign. It feels as if I will break my unit tests, forcing wasted time to refactor them as well, often because unit tests, by definition, have to be coupled so closely to the code structure. So, is it worth all the wasted time when functional tests, which will never break when I refactor/redesign, should find most defects? Do unit tests really provide that much extra defect detection over thorough functional tests? And how does one create good unit tests that work with very quick and agile code that is modified rapidly? P.S. I would be fine/happy with links to anything one considers an excellent resource for how to 'do' unit testing in a rapidly changing environment. Edit: to clarify, I am doing a bit of very unofficial TDD; I just seem to be writing tests at what would be considered a functional level rather than a unit level. I think part of this is because I own nearly all of the project, so I don't feel I need to limit the scope as much; and part of it is that it's daunting to think of trying to go back and retroactively add the unit tests needed to cover enough code that I can feel comfortable testing only a unit, without the full functionality, and trust that the unit still works with the rest of the units.

    Read the article

  • Database unit testing is now available for SSDT

    - by jamiet
    Good news was announced yesterday for those that are using SSDT and want to write unit tests, unit testing functionality is now available. The announcement was made on the SSDT team blog in post Available Today: SSDT—December 2012. Here are a few thoughts about this news. Firstly, there seems to be a general impression that database unit testing was not previously available for SSDT – that’s not entirely true. Database unit testing was most recently delivered in Visual Studio 2010 and any database unit tests written therein work perfectly well against SQL Server databases created using SSDT (why wouldn’t they – its just a database after all). In other words, if you’re running SSDT inside Visual Studio 2010 then you could carry on freely writing database unit tests; some of the tight integration between the two (e.g. right-click on an object in SQL Server Object Explorer and choose to create a unit test) was not there – but I’ve never found that to be a problem. I am currently working on a project that uses SSDT for database development and have been happily running VS2010 database unit tests for a few months now. All that being said, delivery of database unit testing for SSDT is now with us and that is good news, not least because we now have the ability to create unit tests in VS2012. We also get tight integration with SSDT itself, the like of which I mentioned above. Having now had a look at the new features I was delighted to find that one of my big complaints about database unit testing has been solved. As I reported here on Connect a refactor operation would cause unit test code to get completely mangled. See here the before and after from such an operation: SELECT    * FROM    bi.ProcessMessageLog pml INNER JOIN bi.[LogMessageType] lmt     ON    pml.[LogMessageTypeId] = lmt.[LogMessageTypeId] WHERE    pml.[LogMessage] = 'Ski[LogMessageTypeName]of message: IApplicationCanceled' AND        lmt.[LogMessageType] = 'Warning'; which is obviously not ideal. Thankfully that seems to have been solved with this latest release. One disappointment about this new release is that the process for running tests as part of a CI build has not changed from the horrendously complicated process required previously. Check out my blog post Setting up database unit testing as part of a Continuous Integration build process [VS2010 DB Tools - Datadude] for instructions on how to do it. In that blog post I describe it as “fiddly” – I was being kind when I said that! @Jamiet

    Read the article

  • psexec failing with return code 122 when used from Windows service

    - by Jeremy McGee
    I've written a WCF service as a wrapper around a C# utility we've written that uses the SysInternals psexec utility to run jobs on a remote system. psexec is invoked from C# with command-line parameters that specify the domain, user and password to use. All works fine when I invoke the C# utility from PowerShell locally. However, when I run the utility from the WCF service we see a return code of 122, which corresponds to (?) "The data area passed to a system call is too small". psexec is running against Windows Server 2008. The credentials I'm passing are local administrator, in the same domain as the machine that hosts the service wrapping the utility.
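    For illustration only (the host, paths and credentials below are placeholders, not values from the question), a minimal sketch of how such a C# wrapper typically shells out to psexec, passing the account with psexec's -u/-p switches and reading the exit code it propagates from the remote process:

    using System;
    using System.Diagnostics;

    class PsExecRunner
    {
        static int RunRemote()
        {
            var psi = new ProcessStartInfo
            {
                FileName = @"C:\Tools\psexec.exe",
                Arguments = @"\\remotehost -u MYDOMAIN\svc_account -p P@ssw0rd -accepteula cmd /c whoami",
                UseShellExecute = false,
                RedirectStandardOutput = true,
                RedirectStandardError = true
            };

            using (var process = Process.Start(psi))
            {
                string output = process.StandardOutput.ReadToEnd();
                process.WaitForExit();
                Console.WriteLine(output);
                // Win32 error 122 is ERROR_INSUFFICIENT_BUFFER
                // ("The data area passed to a system call is too small").
                return process.ExitCode;
            }
        }
    }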

    Read the article

  • Survey: how do you unit test your T-SQL?

    - by Alexander Kuznetsov
    How do you unit test your T-SQL? Which libraries/tools do you use? What percentage of your code is covered by unit tests and how do you measure it? Do you think the time and effort which you invested in your unit testing harness has paid off or not? ...(read more)

    Read the article

  • xen 4.1 host periodically dropping network packets of domU

    - by Dyutiman Chakraborty
    I have xen 4.1 Host running on a ubuntu 12.04 LTS Server with ip 153.x.x.54. I have setup 2 VMs on it, namely, "dev.mydomain.com" and "web.mydomain.com" with ips 195.X.X.2 and 195.x.x.3 respectively. For network the VMs connect through xendbr0 (xen-bridge), and can accces the network properly. I can also login to the VMs with ssh with no issue. However when I ping any of the VMs, there is a high amount of periodic packet drop. If I the ping the xen host (dom0) there is no packet drop. Following is a output of "tcpdump | grep ICMP" on dOM0 while I was pinging one of the domU tcpdump: verbose output suppressed, use -v or -vv for full protocol decode listening on eth0, link-type EN10MB (Ethernet), capture size 65535 bytes 05:19:55.682493 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 30, length 64 05:19:56.691144 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 31, length 64 05:19:57.698776 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 32, length 64 05:19:58.706784 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 33, length 64 05:19:59.714751 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 34, length 64 05:20:00.723144 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 35, length 64 05:20:01.730349 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 36, length 64 05:20:02.739017 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 37, length 64 05:20:03.746806 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 38, length 64 05:20:06.770326 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 41, length 64 05:20:07.778801 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 42, length 64 05:20:08.786481 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 43, length 64 05:20:09.794720 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 44, length 64 05:20:10.802395 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 45, length 64 05:20:11.810770 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 46, length 64 05:20:12.818511 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 47, length 64 05:20:13.826817 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 48, length 64 05:20:14.835125 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 49, length 64 05:20:15.842138 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3460, seq 50, length 64 05:20:18.274072 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 1, length 64 05:20:19.282347 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 
2, length 64 05:20:20.290746 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 3, length 64 05:20:21.297910 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 4, length 64 05:20:22.305656 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 5, length 64 05:20:23.314369 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 6, length 64 05:20:24.322055 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 7, length 64 05:20:25.329782 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 8, length 64 05:20:26.338473 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 9, length 64 05:20:27.346411 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 10, length 64 05:20:28.354175 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 11, length 64 05:20:29.361640 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 12, length 64 05:20:30.370026 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 13, length 64 05:20:31.377696 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 14, length 64 05:20:32.386151 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 15, length 64 05:20:33.394118 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 16, length 64 05:20:34.402058 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 17, length 64 05:20:35.409002 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 18, length 64 05:20:36.417692 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > web.mydomain.com: ICMP echo request, id 3461, seq 19, length 64 05:20:36.496916 IP6 fe80::3285:a9ff:feec:fc69 > ip6-allnodes: HBH ICMP6, multicast listener querymax resp delay: 1000 addr: ::, length 24 05:20:36.499112 IP6 fe80::21c:c0ff:fe6c:c091 > ff02::1:ff6c:c091: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff6c:c091, length 24 05:20:36.507041 IP6 fe80::227:eff:fe11:fa3f > ff02::1:ff00:2: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff00:2, length 24 05:20:36.523919 IP6 fe80::21c:c0ff:fe77:6257 > ff02::1:ff77:6257: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff77:6257, length 24 05:20:36.544785 IP6 fe80::54:ff:fe12:ea9a > ff02::1:ff12:ea9a: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff12:ea9a, length 24 05:20:36.581740 IP6 fe80::5604:a6ff:fef1:6da7 > ff02::1:fff1:6da7: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:fff1:6da7, length 24 05:20:36.600103 IP6 fe80::8a8:8aa0:5e18:917a > ff02::1:ff18:917a: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff18:917a, length 24 05:20:36.601989 IP6 fe80::227:eff:fe11:fa3e > ff02::1:ff11:fa3e: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff11:fa3e, length 24 05:20:36.611090 IP6 
fe80::dcad:56ff:fe57:3bbe > ff02::1:ff57:3bbe: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff57:3bbe, length 24 05:20:36.660521 IP6 fe80::54:ff:fe02:1d31 > ff02::1:ff00:6: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff00:6, length 24 05:20:36.698871 IP6 fe80::21e:8cff:feb4:9f89 > ff02::1:ffb4:9f89: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ffb4:9f89, length 24 05:20:36.776548 IP6 fe80::54:ff:fe12:ea9a > ff02::1:ff01:7: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff01:7, length 24 05:20:36.781910 IP6 fe80::54:ff:fe8f:6dd > ff02::1:ff00:3: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff00:3, length 24 05:20:36.865475 IP6 fe80::21c:c0ff:fe4a:ae9f > ff02::1:ff4a:ae9f: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff4a:ae9f, length 24 05:20:36.908333 IP6 fe80::dcad:45ff:fe90:84db > ff02::1:ff90:84db: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff90:84db, length 24 05:20:36.919653 IP6 fe80::54:ff:fe12:ea9a > ff02::1:ff00:7: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff00:7, length 24 05:20:36.924276 IP6 fe80::59a2:2a4a:2082:6dee > ff02::1:ff82:6dee: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff82:6dee, length 24 05:20:37.001905 IP6 fe80::54:ff:fe8f:6dd > ff02::1:ff8f:6dd: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff8f:6dd, length 24 05:20:37.042403 IP6 fe80::54:ff:fe95:54f2 > ff02::1:ff95:54f2: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff95:54f2, length 24 05:20:37.090992 IP6 fe80::21c:c0ff:fe77:62ac > ff02::1:ff77:62ac: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff77:62ac, length 24 05:20:37.098118 IP6 fe80::d63d:7eff:fe01:b67f > ff02::1:ff01:b67f: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff01:b67f, length 24 05:20:37.118784 IP6 fe80::54:ff:fe12:ea9a > ff02::202: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::202, length 24 05:20:37.168548 IP6 fe80::54:ff:fe02:1d31 > ff02::1:ff02:1d31: HBH ICMP6, multicast listener reportmax resp delay: 0 addr: ff02::1:ff02:1d31, length 24 05:20:41.743286 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 1, length 64 05:20:41.743542 IP dev.mydomain.com > ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in: ICMP echo reply, id 3463, seq 1, length 64 05:20:42.743859 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 2, length 64 05:20:42.743952 IP dev.mydomain.com > ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in: ICMP echo reply, id 3463, seq 2, length 64 05:20:43.745689 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 3, length 64 05:20:43.745777 IP dev.mydomain.com > ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in: ICMP echo reply, id 3463, seq 3, length 64 05:20:44.746706 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 4, length 64 05:20:44.746796 IP dev.mydomain.com > ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in: ICMP echo reply, id 3463, seq 4, length 64 05:20:45.747986 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 5, length 64 05:20:45.748082 IP dev.mydomain.com > ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in: ICMP echo reply, id 3463, 
seq 5, length 64 05:20:46.749834 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 6, length 64 05:20:46.749920 IP dev.mydomain.com > ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in: ICMP echo reply, id 3463, seq 6, length 64 05:20:47.750838 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 7, length 64 05:20:47.751182 IP dev.mydomain.com > ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in: ICMP echo reply, id 3463, seq 7, length 64 05:20:48.751909 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 8, length 64 05:20:48.751991 IP dev.mydomain.com > ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in: ICMP echo reply, id 3463, seq 8, length 64 05:20:49.752542 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 9, length 64 05:20:49.752620 IP dev.mydomain.com > ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in: ICMP echo reply, id 3463, seq 9, length 64 05:20:50.754246 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 10, length 64 05:20:51.753856 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 11, length 64 05:20:52.752868 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 12, length 64 05:20:53.754174 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 13, length 64 05:20:54.753972 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 14, length 64 05:20:55.753814 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 15, length 64 05:20:56.753391 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 16, length 64 05:20:57.753683 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 17, length 64 05:20:58.753487 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 18, length 64 05:20:59.754013 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 19, length 64 05:21:00.753169 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 20, length 64 05:21:01.753757 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 21, length 64 05:21:02.753307 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 22, length 64 05:21:03.753021 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 23, length 64 05:21:04.753628 IP ABTS-North-Dynamic-226.X.X.122.airtelbroadband.in > dev.mydomain.com: ICMP echo request, id 3463, seq 24, length 64 ^C479 packets captured 718 packets received by filter 238 packets dropped by kernel 3 packets dropped by interface You see the ping request is not responed to initially, then for a moment it is replied back and then again no reply. I have tried everything (to the best of my knowledge) to fix this, but can't find any answer Any help will be greatly appreciated Thanks.

    Read the article

  • Are "TDD Tests" different to Unit Tests?

    - by asgeo1
    I read this article about TDD and unit testing: http://stephenwalther.com/blog/archive/2009/04/11/tdd-tests-are-not-unit-tests.aspx I think it was an excellent article. The author makes a distinction between what he calls "TDD Tests" and unit testing. They appear to be different tests to him. Previous to reading this article I thought unit tests were a by-product of TDD. I didn't realise you might also create "TDD tests". The author seems to imply that creating unit tests is not enough for TDD as the granularity of a unit test is too small for what we are trying to achieve with TDD. So his TDD tests might test a few classes at once. At the end of the article there is some discussion from the author with some other people about whether there really is a distinction between "TDD Tests" and unit testing. Seems to be some contention around this idea. The example "TDD tests" the author showed at the end of the article just looked like normal MVC unit tests to me - perhaps "TDD tests" vs unit tests is just a matter of semantics? I would like to hear some more opinions on this, and whether there is / isn't a distinction between the two tests.

    Read the article

  • Do Repeat Yourself in Unit Tests

    - by João Angelo
    Don’t get me wrong, I’m a big supporter of the DRY (Don’t Repeat Yourself) principle - except, however, when it comes to unit tests. Why? Well, in my opinion a unit test should be a self-contained group of actions with the intent to test a very specific piece of code, and it should not depend on externals shared with other unit tests. In a typical unit test we can divide the code into two major groups: preparation of preconditions for the code under test; invocation of the code under test. It’s in the first group that you are tempted to refactor common code from several unit tests into helper methods that can then be called in each one of them. Another way to avoid duplicating code is to use the built-in infrastructure of some unit test frameworks, such as SetUp/TearDown methods that automatically run before and after each unit test. I must admit that in the past I was guilty of both charges, but what at first seemed a good idea, since I was removing code duplication, turned out to offer no added value and even complicated the process when a given test failed. We love unit tests because of their rapid feedback when something goes wrong. However, that feedback usually requires reading the code for the failed test. Given this, what do you prefer? To read a single method, or to wander through several methods like SetUp/TearDown and private common methods? I say it again: do repeat yourself in unit tests. It may feel wrong at first, but I bet you won’t regret it later.
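    As a small illustration of the point (a sketch of my own, not code from the post), here is an MSTest example where each test repeats its own arrange step instead of relying on a shared SetUp/TestInitialize method, so a failing test can be read top to bottom in one place; the ShoppingCart class is hypothetical:

    using System.Collections.Generic;
    using System.Linq;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    // Hypothetical class under test.
    public class ShoppingCart
    {
        private readonly List<decimal> _prices = new List<decimal>();
        public void Add(string name, decimal price) { _prices.Add(price); }
        public decimal Total { get { return _prices.Sum(); } }
    }

    [TestClass]
    public class ShoppingCartTests
    {
        // No [TestInitialize]: each test builds exactly the state it needs.

        [TestMethod]
        public void Total_IsZeroForAnEmptyCart()
        {
            var cart = new ShoppingCart();          // repeated on purpose
            Assert.AreEqual(0m, cart.Total);
        }

        [TestMethod]
        public void Total_SumsTheItemPrices()
        {
            var cart = new ShoppingCart();          // repeated on purpose
            cart.Add("book", 12.50m);
            cart.Add("pen", 2.50m);
            Assert.AreEqual(15m, cart.Total);
        }
    }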

    Read the article

  • Unit testing statically typed functional code

    - by back2dos
    I wanted to ask you people in which cases it makes sense to unit test statically typed functional code, as written in Haskell, Scala, OCaml, Nemerle, F# or haXe (the last is what I am really interested in, but I wanted to tap into the knowledge of the bigger communities). I ask this because, from my understanding: One aspect of unit tests is to have the specs in runnable form. However, when employing a declarative style that directly maps the formalized specs to language semantics, is it even actually possible to express the specs in runnable form in a separate way that adds value? The more obvious aspect of unit tests is to track down errors that cannot be revealed through static analysis, and type-safe functional code is a good tool for coding extremely close to what your static analyzer understands. However, a simple mistake like using x instead of y (both being coordinates) in your code cannot be caught this way. Then again, such a mistake could also arise while writing the test code, so I am not sure whether it's worth the effort. Unit tests do introduce redundancy, which means that when requirements change, the code implementing them and the tests covering this code must both be changed. This overhead is of course roughly constant, so one could argue that it doesn't really matter. In fact, in languages like Ruby it really doesn't, compared to the benefits, but given how much of the ground unit tests are intended for is covered by statically typed functional programming, it feels like a constant overhead one can simply reduce without penalty. From this I'd deduce that unit tests are somewhat obsolete in this programming style. Of course such a claim can only lead to religious wars, so let me boil this down to a simple question: When you use such a programming style, to what extent do you use unit tests and why (what quality do you hope to gain for your code)? Or, the other way round: do you have criteria by which you can qualify a unit of statically typed functional code as covered by the static analyzer, and hence as needing no unit test coverage?
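    The x-versus-y point can be made concrete with a small example (mine, not the poster's, and in C# rather than one of the languages listed): the type checker accepts swapped coordinates without complaint, and only a test with known expected values catches the slip:

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    public static class Screen
    {
        // Both coordinates are plain ints, so swapping x and y anywhere
        // still type-checks; only a value-level test can notice.
        public static bool IsVisible(int x, int y)
        {
            return x >= 0 && x < 1920 && y >= 0 && y < 1080;
        }
    }

    [TestClass]
    public class ScreenTests
    {
        [TestMethod]
        public void IsVisible_DistinguishesXFromY()
        {
            Assert.IsTrue(Screen.IsVisible(1500, 900));   // inside a 1920x1080 screen
            Assert.IsFalse(Screen.IsVisible(900, 1500));  // swapped arguments are out of bounds
        }
    }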

    Read the article

  • Unit and Integration testing: How can it become a reflex

    - by LordOfThePigs
    All the programmers in my team are familiar with unit testing and integration testing. We have all worked with it. We have all written tests with it. Some of us have even felt an improved sense of trust in our own code. However, for some reason, writing unit/integration tests has not become a reflex for any of the members of the team. None of us actually feel bad when not writing unit tests at the same time as the actual code. As a result, our codebase is mostly uncovered by unit tests, and projects enter production untested. The problem with that, of course, is that once your projects are in production and are already working well, it is virtually impossible to obtain time and/or budget to add unit/integration testing. The members of my team and I are already familiar with the value of unit testing (1, 2), but it doesn't seem to help bring unit testing into our natural workflow. In my experience, making unit tests and/or a target coverage level mandatory just results in poor-quality tests and slows down team members, simply because there is no self-generated motivation to produce these tests. Also, as soon as pressure eases, unit tests are not written any more. My question is the following: Are there any methods that you have experimented with that help build a dynamic/momentum inside the team, leading to people naturally wanting to create and maintain those tests?

    Read the article

  • What is the objective of unit testing?

    - by user728750
    I've been working with C# for the last 2 years, and I've never done any unit testing. I just need to know what the objective of unit testing is. What kind of results do we expect from unit testing? Is code quality checked by unit testing? In my view, unit testing is the job of testers; if that is true, then as a developer why would I need to write test code if the tester does the unit testing? Why should I write extra code for testing? Do I need to maintain a separate copy of a project for unit testing?

    Read the article

  • How can I delete everything after the first column in Notepad++?

    - by Bob J
    I'm trying to get rid of everything after the first column in Notepad++. Column mode is not an option. Is it possible? What I have:
    70.97.110.40 159 ms [n/a] 21
    70.97.117.177 134 ms [n/a] 21
    70.97.120.10 75 ms [n/a] 21
    70.97.122.105 87 ms www.portless.net 21
    70.97.122.106 89 ms www.popovetsky.org 21
    70.97.122.107 95 ms www.psmythe.net 21
    70.97.122.104 98 ms wasabi.prostructure.com 21
    70.97.122.108 89 ms crm.prostructure.com 21
    70.97.122.109 87 ms internal.prostructure.com 21
    What I want:
    70.97.110.40
    70.97.117.177
    70.97.120.10
    70.97.122.105
    70.97.122.106
    70.97.122.107
    70.97.122.104
    70.97.122.108
    70.97.122.109
    Thanks
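    One possible approach (an assumption on my part, not something stated in the question): recent Notepad++ versions offer a "Regular expression" search mode in the Replace dialog, where Find what = ^(\S+).*$ and Replace with = $1 keeps only the first whitespace-delimited column. The same pattern shown in C#, just to illustrate what it does:

    using System;
    using System.Text.RegularExpressions;

    class FirstColumnOnly
    {
        static void Main()
        {
            // Keep only the first whitespace-delimited field of a line.
            string line = "70.97.122.105 87 ms www.portless.net 21";
            string ipOnly = Regex.Replace(line, @"^(\S+).*$", "$1");
            Console.WriteLine(ipOnly); // prints 70.97.122.105
        }
    }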

    Read the article

  • Is unit testing development or testing?

    - by Rubio
    I had a discussion with a testing manager about the role of unit and integration testing. She requested that developers report what they have unit and integration tested and how. My perspective is that unit and integration testing are part of the development process, not the testing process. Beyond semantics what I mean is that unit and integration tests should not be included in the testing reports and systems testers should not be concerned about them. My reasoning is based on two things. Unit and integration tests are planned and performed against an interface and a contract, always. Regardless of whether you use formalized contracts you still test what e.g. a method is supposed to do, i.e. a contract. In integration testing you test the interface between two distinct modules. The interface and the contract determine when the test passes. But you always test a limited part of the whole system. Systems testing on the other hand is planned and performed against the system specifications. The spec determines when the test passes. I don't see any value in communicating the breadth and depth of unit and integration tests to the (systems) tester. Suppose I write a report that lists what kind of unit tests are performed on a particular business layer class. What is he/she supposed to take away from that? Judging what should and shouldn't be tested from that is a false conclusion because the system may still not function the way the specs require even though all unit and integration tests pass. This might seem like useless academic discussion but if you work in a strictly formal environment as I do, it's actually important in determining how we do things. Anyway, am I totally wrong? (Sorry for the long post.)

    Read the article

  • Does unit testing lead to premature generalization (specifically in the context of C++)?

    - by Martin
    Preliminary notes: I'll not go into the distinction between the different kinds of test there are; there are already a few questions on these sites regarding that. I'll take what's there, and that says: unit testing in the sense of "testing the smallest isolatable unit of an application", from which this question actually derives. The isolation problem: What is the smallest isolatable unit of a program? Well, as I see it, it (highly?) depends on what language you are coding in. Michael Feathers talks about the concept of a seam: [WEwLC, p31] A seam is a place where you can alter behavior in your program without editing in that place. And without going into the details, I understand a seam -- in the context of unit testing -- to be a place in a program where your "test" can interface with your "unit". Examples: Unit tests -- especially in C++ -- require the code under test to add more seams than would strictly be called for by the problem itself. For example: adding a virtual interface where a non-virtual implementation would have been sufficient; splitting -- generalizing(?) -- a (smallish) class further "just" to facilitate adding a test; splitting a single-executable project into seemingly "independent" libs, "just" to facilitate compiling them independently for the tests. The question: I'll try a few versions that hopefully ask about the same point: Is the way that unit tests require one to structure an application's code "only" beneficial for the unit tests, or is it actually beneficial to the application's structure? Is the generalization code needs to exhibit in order to be unit-testable useful for anything but the unit tests? Does adding unit tests force one to generalize unnecessarily? Is the shape unit tests force on code "always" also a good shape for the code in general as seen from the problem domain? I remember a rule of thumb that said don't generalize until you need to / until there's a second place that uses the code. With unit tests, there's always a second place that uses the code -- namely the unit test. So is this reason enough to generalize?
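    The question is asked about C++, but as a language-neutral sketch of the kind of seam described above (an interface introduced mainly so a test can substitute the collaborator), here is a short C# example with hypothetical names:

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    // The interface exists mainly so a test can slide a fake in - exactly the
    // kind of extra generalization the question asks about.
    public interface ITemperatureSensor
    {
        double ReadCelsius();
    }

    public class Thermostat
    {
        private readonly ITemperatureSensor _sensor;
        public Thermostat(ITemperatureSensor sensor) { _sensor = sensor; }
        public bool ShouldHeat(double targetCelsius)
        {
            return _sensor.ReadCelsius() < targetCelsius;
        }
    }

    // A hand-rolled fake used only by the tests.
    public class FakeSensor : ITemperatureSensor
    {
        private readonly double _value;
        public FakeSensor(double value) { _value = value; }
        public double ReadCelsius() { return _value; }
    }

    [TestClass]
    public class ThermostatTests
    {
        [TestMethod]
        public void ShouldHeat_WhenRoomIsColderThanTarget()
        {
            var thermostat = new Thermostat(new FakeSensor(18.0));
            Assert.IsTrue(thermostat.ShouldHeat(21.0));
        }
    }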

    Read the article

  • Automated unit testing, integration testing or acceptance testing

    - by bjarkef
    TDD and unit testing seem to be all the rage at the moment. But is it really that useful compared to other forms of automated testing? Intuitively I would guess that automated integration testing is way more useful than unit testing. In my experience most bugs seem to be in the interaction between modules, and not so much in the actual (usually limited) logic of each unit. Also, regressions often happen because of changing interfaces between modules (and changed pre- and post-conditions). Am I misunderstanding something, or why is unit testing getting so much focus compared to integration testing? Is it simply because it is assumed that integration testing is something you already have, and unit testing is the next thing we need to learn to apply as developers? Or maybe unit testing simply yields the highest gain compared to the complexity of automating it? What is your experience with automated unit testing, automated integration testing, and automated acceptance testing, and in your experience what has yielded the highest ROI? And why? If you had to pick just one form of testing to be automated on your next project, which would it be? Thanks in advance.

    Read the article

  • CppUnit for unit-testing executable files?

    - by hagubear
    I am not sure if anyone has done it. I am trying to do something that is, in general, uncommon, i.e. unit-testing executables (Windows) or ELFs (Linux). I know that CppUnit provides a good unit testing facility, but I have never used it for unit testing (I used UnitTest++). I hear rumours that you can unit-test executables too. Does anyone have experience with this? A relevant post regarding the philosophy of it was here

    Read the article

  • Should I demand unit-testing from programmers?

    - by Morten
    I work at a place where we buy a lot of IT projects. We are currently producing a standard for system requirements for the requisition of future projects. In that process, we are discussing whether or not we can demand automated unit testing from our suppliers. I firmly believe that proper automated unit testing is the only way to document the quality and stability of the code. Everyone else seems to think that unit testing is an optional method that concerns the supplier alone. Thus, we will make no demands for automated unit testing, continuous testing, coverage reports, inspections of unit tests or anything of the kind. I find this policy extremely frustrating. Am I totally out of line here? Please provide me with arguments for either of the opinions.

    Read the article
