Search Results

Search found 25284 results on 1012 pages for 'test driven'.


  • Load Test Manifesto

    - by jchang
    Load testing used to be a standard part of software development, but not anymore. Now people express a preference for assessing performance on the production system. There is a lack of confidence that a load test reflects what will actually happen in production. In essence, it has become accepted that the value of load testing is not worth the cost and time, and perhaps a doubt that there is any value at all. The main problem is the load test plan criteria – excessive focus on perceived importance...(read more)

    Read the article

  • How do you model roles / relationships with Domain Driven Design in mind?

    - by kitsune
    If I have three entities, Project, ProjectRole and Person, where a Person can be a member of different Projects and be in different Project Roles (such as "Project Lead" or "Project Member") - how would you model such a relationship? In the database, I currently have the following tables: Project, Person, ProjectRole, and Project_Person (with PersonId & ProjectId as the PK and a ProjectRoleId as a FK relationship). I'm really at a loss here since all domain models I come up with seem to break some "DDD" rule. Are there any 'standards' for this problem? I had a look at Streamlined Object Modeling and there is an example of what a Project and ProjectMember would look like, but AddProjectMember() in Project would call ProjectMember.AddProject(). So Project has a List of ProjectMembers, and each ProjectMember in return has a reference to the Project. Looks a bit convoluted to me.

    Update: After reading more about this subject, I will try the following. There are distinct roles, or better, model relationships, that are of a certain role type within my domain. For instance, ProjectMember is a distinct role that tells us something about the relationship a Person plays within a Project. It contains a ProjectMembershipType that tells us more about the Role it will play. I do know for certain that persons will have to play roles inside a project, so I will model that relationship. ProjectMembershipTypes can be created and modified. These can be "Project Leader", "Developer", "External Adviser", or something different. A person can have many roles inside a project, and these roles can start and end at a certain date. Such relationships are modeled by the class ProjectMember:

        public class ProjectMember : IRole
        {
            public virtual int ProjectMemberId { get; set; }
            public virtual ProjectMembershipType ProjectMembershipType { get; set; }
            public virtual Person Person { get; set; }
            public virtual Project Project { get; set; }
            public virtual DateTime From { get; set; }
            public virtual DateTime Thru { get; set; }
            // etc...
        }

    ProjectMembershipType, i.e. "Project Manager", "Developer", "Adviser":

        public class ProjectMembershipType : IRoleType
        {
            public virtual int ProjectMembershipTypeId { get; set; }
            public virtual string Name { get; set; }
            public virtual string Description { get; set; }
            // etc...
        }

    Read the article

  • How can I effectively test against the Windows API?

    - by Billy ONeal
    I'm still having issues justifying TDD to myself. As I have mentioned in other questions, 90% of the code I write does absolutely nothing but call some Windows API functions and print out the data returned from said functions. The time spent coming up with the fake data that the code needs to process under TDD is incredible -- I literally spend 5 times as much time coming up with the example data as I would spend just writing application code. Part of this problem is that often I'm programming against APIs with which I have little experience, which forces me to write small applications that show me how the real API behaves so that I can write effective fakes/mocks on top of that API. Writing implementation first is the opposite of TDD, but in this case it is unavoidable: I do not know how the real API behaves, so how on earth am I going to be able to create a fake implementation of the API without playing with it? I have read several books on the subject, including Kent Beck's Test-Driven Development: By Example and Michael Feathers' Working Effectively with Legacy Code, which seem to be gospel for TDD fanatics. Feathers' book comes close in the way it describes breaking out dependencies, but even then, the examples provided have one thing in common: the program under test obtains input from other parts of the program under test. My programs do not follow that pattern. Instead, the only input to the program itself is the system upon which it runs. How can one effectively employ TDD on such a project?
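
    The usual shape of the seam being discussed here is a thin wrapper interface over the system call, with a trivial pass-through implementation for production and a hand-written fake for tests. The sketch below shows the idea in C# with invented names (Directory.GetLogicalDrives stands in for whatever Windows API call the real code makes); it is an illustration of the technique, not the asker's code.

        using System;
        using System.IO;

        // The code under test depends on this seam instead of calling the OS directly.
        public interface IDriveSource
        {
            string[] GetLogicalDrives();
        }

        // Production implementation: a thin pass-through to the real system call
        // (standing in here for whatever Windows API the real program uses).
        public class SystemDriveSource : IDriveSource
        {
            public string[] GetLogicalDrives() { return Directory.GetLogicalDrives(); }
        }

        // Test implementation: returns whatever the test needs, no OS involved.
        public class FakeDriveSource : IDriveSource
        {
            public string[] Drives = { @"C:\", @"D:\" };
            public string[] GetLogicalDrives() { return Drives; }
        }

        // The "print out the data" part becomes trivially testable against the fake.
        public class DriveReport
        {
            private readonly IDriveSource source;
            public DriveReport(IDriveSource source) { this.source = source; }

            public string Render()
            {
                return string.Join(Environment.NewLine, source.GetLogicalDrives());
            }
        }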

    Read the article

  • C++ Templates: implicit conversion, no matching function for call to ctor

    - by noname
        template<class T>
        class test {
        public:
            test() { }
            test(T& e) { }
        };

        int main() {
            test<double> d(4.3);
            return 0;
        }

    Compiled using g++ 4.4.1 with the following errors:

        g++ test.cpp -Wall -o test.exe
        test.cpp: In function 'int main()':
        test.cpp:18: error: no matching function for call to 'test<double>::test(double)'
        test.cpp:9: note: candidates are: test<T>::test(T&) [with T = double]
        test.cpp:5: note:                 test<T>::test() [with T = double]
        test.cpp:3: note:                 test<double>::test(const test<double>&)
        make: *** [test.exe] Error 1

    However, this works:

        double a=1.1;
        test<double> d(a);

    Why is this happening? Is it possible that g++ cannot implicitly convert the literal expression 1.1 to double? Thanks.

    Read the article

  • What makes my code DDD (domain-driven design) qualified?

    - by oykuo
    Hi all, I'm new to DDD and am thinking about using this design technique in my project. However, what strikes me about DDD is how basic the idea is. Unlike other design techniques such as MVC and TDD, it doesn't seem to contain any groundbreaking ideas. For example, I'm sure some of you will have the same feeling that the idea of root aggregates and repositories is nothing new, because when you were writing MVC web applications you had to have one single master object (i.e. the root aggregate) that contained other minor objects (i.e. value objects and entities) in the model layer in order to send data to a strongly typed view. To me, the only new ideas in DDD are probably (1) the "smart" entities (i.e. you are supposed to have business rules on root aggregates) and (2) the separation between value objects, root aggregates and entities. Can anyone tell me if I have missed anything here? If that's all there is to DDD, and I update one of my existing MVC applications with the above 2 new ideas, can I claim it's a TDD, MVC and DDD application?
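
    For readers new to the terms used above, here is a rough C# sketch of the two ideas the question singles out - a value object, an entity, and an aggregate root that keeps a business rule on itself. The ordering domain is invented for illustration and is not the asker's code.

        using System;
        using System.Collections.Generic;

        // Value object: immutable and compared by its values, it has no identity of its own.
        public sealed class Money
        {
            public decimal Amount { get; private set; }
            public string Currency { get; private set; }
            public Money(decimal amount, string currency) { Amount = amount; Currency = currency; }
        }

        // Entity: has an identity inside the aggregate.
        public class OrderLine
        {
            public int LineId { get; set; }
            public Money Price { get; set; }
        }

        // Aggregate root: the single entry point that owns its entities and
        // enforces the business rule itself (the "smart" entity idea).
        public class Order
        {
            private readonly List<OrderLine> lines = new List<OrderLine>();

            public IReadOnlyCollection<OrderLine> Lines
            {
                get { return lines; }
            }

            public void AddLine(OrderLine line)
            {
                if (lines.Count >= 10)
                    throw new InvalidOperationException("An order cannot have more than 10 lines.");
                lines.Add(line);
            }
        }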

    Read the article

  • smartctl short test doesn't seem to complete

    - by Cédric COPY
    I am working on a project which involves automated HDD testing through smartctl. The station works fine on most products, but I have two specific products that fail the smartctl test. Those two products are both WD products (the WD2500BUDT series). The smartctl behaviour is quite strange: the test is launched without any problem, I wait about 2 min (the test length), and when I check smartctl I have no result at all. It's like I hadn't launched any test (no fail, no success in the smartctl result). No error is returned by the command, nothing in syslog, etc. As I said before, the test works for other products; thousands of products have passed this test. The main smartctl commands used are:

        smartctl -t shortest /dev/sdX    # Launch test
        smartctl -l selftest /dev/sdX    # Look at test result

    I have also tried to use:

        smartctl -s on /dev/sdX
        smartctl -o on /dev/sdX

    but that doesn't change anything. The system is running Debian 6.0, smartctl v5.40 (rev 3124) x86_64, and the HDDs are plugged in through a SATA-to-PCI controller. I have 4 HDDs connected at a time. Well, if anyone has some hints to give on this problem, because I have no idea how I can fix it. Thanks in advance. PS: Not sure if this was a Server Fault topic, sorry if I was wrong!

    Read the article

  • Auto Mocking using JustMock

    - by mehfuzh
    Auto mocking containers are designed to reduce the friction of keeping unit test beds in sync with the code being tested as systems are updated and evolve over time. That is the one-sentence definition of auto mocking, and of course it is the more or less formal one. Put more informally, auto mocking containers are nothing but a tool to keep your tests synced, so that you don't have to go back and change tests every time you add a new dependency to your SUT, or System Under Test. As of the Q3 2012 release, JustMock ships with a built-in auto mocking container. This will let developers keep all the existing fun they are having with JustMock, and they can now also mock objects with dependencies in a more elegant way, without needing to do the homework of managing the graph. If you are not familiar with auto mocking, I won't go ahead and educate you here, but rather ask you to catch up from the content already available from the community, as that is beyond the scope of this post. Moving forward, getting started with JustMock auto mocking is pretty simple. First, I have to reference Telerik.JustMock.Container.DLL from the installation folder along with Telerik.JustMock.DLL (of course), which it uses internally, and next I write my tests with the mocking container. It's that simple! In this post I will first mock the target with dependencies using the current method, and then do the same with the auto mocking container. In short, the sample is all about a report builder that will go through all the existing reports, send email, and log any exception in that process. This is roughly what my report builder class looks like: [code not included in this excerpt]. The Reporter class depends on the following interfaces:

        - IReportBuilder: used to create and get the available reports
        - IReportSender: used to send the reports
        - ILogger: used to log any exception

    Now, if I just write the test without using an auto mocking container, it might end up something like this: [code not included in this excerpt]. It looks fine; however, the only issue is that I am creating the mock of each dependency, which is sort of grunt work, and if you have an ever-changing list of dependencies it becomes really hard to keep the tests in sync. The typical example is your ASP.NET MVC controller, where the number of service dependencies grows along with the project. The same test, written with the auto mocking container, would look like this: [code not included in this excerpt]. Here are a few things to observe:

        - I didn't create a mock for each dependency
        - There is no extra step creating the Reporter class and sending in the dependencies
        - Since ILogger is not required for the purpose of this test, I can be completely ignorant of it

    How cool is that? Auto mocking in JustMock has just been released, and we also want to extend it even further using the profiler, so that it can resolve not just interfaces but concrete classes as well. But that of course starts the debate of code smell vs. working with legacy code. Feel free to send in your expert opinion in that regard using one of Telerik's official channels. Hope that helps.
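
    The code samples the post refers to are not reproduced in this excerpt. As a rough reconstruction of the shape being described - the class and member names below are guesses based on the post's prose, and the exact container API should be checked against the JustMock documentation - an auto-mocked test might look something like this:

        using System;
        using Microsoft.VisualStudio.TestTools.UnitTesting;
        using Telerik.JustMock;
        using Telerik.JustMock.AutoMock;

        // Invented shapes matching the post's description of the system under test.
        public class Report { }
        public interface IReportBuilder { Report[] CreateReports(); }
        public interface IReportSender { void Send(Report report); }
        public interface ILogger { void Log(Exception ex); }

        public class Reporter
        {
            private readonly IReportBuilder builder;
            private readonly IReportSender sender;
            private readonly ILogger logger;

            public Reporter(IReportBuilder builder, IReportSender sender, ILogger logger)
            {
                this.builder = builder;
                this.sender = sender;
                this.logger = logger;
            }

            public void SendReports()
            {
                foreach (var report in builder.CreateReports())
                {
                    try { sender.Send(report); }
                    catch (Exception ex) { logger.Log(ex); }
                }
            }
        }

        [TestClass]
        public class ReporterTests
        {
            [TestMethod]
            public void SendsEverythingTheBuilderProduces()
            {
                // The container builds Reporter and injects mocks for all three
                // dependencies; ILogger never has to be mentioned because this
                // particular test doesn't care about it.
                var container = new MockingContainer<Reporter>();

                container.Arrange<IReportBuilder>(b => b.CreateReports())
                         .Returns(new[] { new Report(), new Report() });
                container.Arrange<IReportSender>(s => s.Send(Arg.IsAny<Report>()))
                         .MustBeCalled();

                container.Instance.SendReports();

                container.AssertAll();
            }
        }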

    Read the article

  • A new name for unit tests

    - by Will
    I never used to like unit testing. I always thought it increased the amount of work I had to do. Turns out, that's only true in terms of the actual number of lines of code you write, and furthermore, this is completely offset by the increase in the number of lines of useful code that you can write in an hour with tests and test driven development. Now I love unit tests, as they allow me to write useful code that quite often works first time! (knock on wood) I have found that people are reluctant to do unit tests or start a project with test driven development if they are under strict timelines or in an environment where others don't do it, so they don't. Kinda like a cultural refusal to even try. I think one of the most powerful things about unit testing is the confidence that it gives you to undertake refactoring. It also gives newfound hope that I can give my code to someone else to refactor/improve, and if my unit tests still work, I can use the new version of the library that they modified pretty much without fear. It's this last aspect of unit testing that I think needs a new name. The unit test is more like a contract of what this code should do now, and in the future. When I hear the word testing, I think of mice in cages, with multiple experiments done on them to see the effectiveness of a compound. This is not what unit testing is; we're not trying out different code to see which is the most effective approach, we're defining what outputs we expect with what inputs. In the mice example, unit tests are more like the definitions of how the universe will work, as opposed to the experiments done on the mice. Am I on crack, or does anyone else see this refusal to do testing, and do they think it happens for a similar reason? What reasons do you / others give for not testing? What do you think their motivations are in not unit testing? And as a new name for unit testing that might get over some of the objections, how about jContract? (A bit Java centric I know :), or Unit Contracts?

    Read the article

  • Selenium Data Driven Testing with C#

    - by Dinesh Kanojia
    Hi all, I am Dinesh Kanojia and I am new to automated testing. I want to perform data-driven testing in Selenium using ASP.NET (C#), Ajax, and almost all the features of jQuery. Can anyone give me the steps for performing data-driven testing using C#, or some demo through which I can perform my testing? Thanks in advance, warm regards, Dinesh.K
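
    One common way to drive a Selenium test from data in C# is NUnit's [TestCase] attribute (or [TestCaseSource] when the rows live in a file or spreadsheet). The sketch below is only an illustration - the URL and element IDs are placeholders, not part of the question:

        using NUnit.Framework;
        using OpenQA.Selenium;
        using OpenQA.Selenium.Chrome;

        [TestFixture]
        public class LoginTests
        {
            private IWebDriver driver;

            [SetUp]
            public void StartBrowser()
            {
                driver = new ChromeDriver();
            }

            [TearDown]
            public void StopBrowser()
            {
                driver.Quit();
            }

            // Each TestCase row is one data-driven run of the same test body.
            [TestCase("alice", "correct-password", true)]
            [TestCase("alice", "wrong-password", false)]
            [TestCase("", "", false)]
            public void Login_SucceedsOnlyWithValidCredentials(string user, string password, bool expectSuccess)
            {
                driver.Navigate().GoToUrl("http://localhost:5000/login");  // placeholder URL
                driver.FindElement(By.Id("username")).SendKeys(user);      // placeholder element IDs
                driver.FindElement(By.Id("password")).SendKeys(password);
                driver.FindElement(By.Id("submit")).Click();

                bool onDashboard = driver.Url.EndsWith("/dashboard");
                Assert.AreEqual(expectSuccess, onDashboard);
            }
        }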

    Read the article

  • Custom validation works in development but not in unit test

    - by Geolev
    I want to validate that at least one of two columns has a value in my model. I found somewhere on the web that I could create a custom validator as follows:

        # Check for the presence of one or another field:
        # :validates_presence_of_at_least_one_field :last_name, :company_name - would require either last_name or company_name to be filled in
        # also works with arrays
        # :validates_presence_of_at_least_one_field :email, [:name, :address, :city, :state] - would require email or a mailing type address
        module ActiveRecord
          module Validations
            module ClassMethods
              def validates_presence_of_at_least_one_field(*attr_names)
                msg = attr_names.collect {|a| a.is_a?(Array) ? " ( #{a.join(", ")} ) " : a.to_s}.join(", ") +
                      "can't all be blank. At least one field must be filled in."
                configuration = { :on => :save, :message => msg }
                configuration.update(attr_names.extract_options!)

                send(validation_method(configuration[:on]), configuration) do |record|
                  found = false
                  attr_names.each do |a|
                    a = [a] unless a.is_a?(Array)
                    found = true
                    a.each do |attr|
                      value = record.respond_to?(attr.to_s) ? record.send(attr.to_s) : record[attr.to_s]
                      found = !value.blank?
                    end
                    break if found
                  end
                  record.errors.add_to_base(configuration[:message]) unless found
                end
              end
            end
          end
        end

    I put this in a file called lib/acs_validator.rb in my project and added "require 'acs_validator'" to my environment.rb. This does exactly what I want, and it works perfectly when I manually test it in the development environment, but when I write a unit test it breaks my test environment. This is my unit test:

        require 'test_helper'

        class CustomerTest < ActiveSupport::TestCase
          # Replace this with your real tests.
          test "the truth" do
            assert true
          end

          test "customer not valid" do
            puts "customer not valid"
            customer = Customer.new
            assert !customer.valid?
            assert customer.errors.invalid?(:subdomain)
            assert_equal "Company Name and Last Name can't both be blank.", customer.errors.on(:contact_lname)
          end
        end

    This is my model:

        class Customer < ActiveRecord::Base
          validates_presence_of :subdomain
          validates_presence_of_at_least_one_field :customer_company_name, :contact_lname,
            :message => "Company Name and Last Name can't both be blank."
          has_one :service_plan
        end

    When I run the unit test, I get the following error:

        DEPRECATION WARNING: Rake tasks in vendor/plugins/admin_data/tasks, vendor/plugins/admin_data/tasks, and vendor/plugins/admin_data/tasks are deprecated. Use lib/tasks instead. (called from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/tasks/rails.rb:10)
        Couldn't drop acs_test : #<ActiveRecord::StatementInvalid: PGError: ERROR: database "acs_test" is being accessed by other users DETAIL: There are 1 other session(s) using the database.
: DROP DATABASE IF EXISTS "acs_test"> acs_test already exists NOTICE: CREATE TABLE will create implicit sequence "customers_id_seq" for serial column "customers.id" NOTICE: CREATE TABLE / PRIMARY KEY will create implicit index "customers_pkey" for table "customers" NOTICE: CREATE TABLE will create implicit sequence "service_plans_id_seq" for serial column "service_plans.id" NOTICE: CREATE TABLE / PRIMARY KEY will create implicit index "service_plans_pkey" for table "service_plans" /usr/bin/ruby1.8 -I"lib:test" "/usr/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb" "test/unit/customer_test.rb" "test/unit/service_plan_test.rb" "test/unit/helpers/dashboard_helper_test.rb" "test/unit/helpers/customers_helper_test.rb" "test/unit/helpers/service_plans_helper_test.rb" /usr/lib/ruby/gems/1.8/gems/activerecord-2.3.8/lib/active_record/base.rb:1994:in `method_missing_without_paginate': undefined method `validates_presence_of_at_least_one_field' for #<Class:0xb7076bd0> (NoMethodError) from /usr/lib/ruby/gems/1.8/gems/will_paginate-2.3.12/lib/will_paginate/finder.rb:170:in `method_missing' from /home/george/projects/advancedcomfortcs/app/models/customer.rb:3 from /usr/local/lib/site_ruby/1.8/rubygems/custom_require.rb:31:in `gem_original_require' from /usr/local/lib/site_ruby/1.8/rubygems/custom_require.rb:31:in `require' from /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.8/lib/active_support/dependencies.rb:158:in `require' from /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.8/lib/active_support/dependencies.rb:265:in `require_or_load' from /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.8/lib/active_support/dependencies.rb:224:in `depend_on' from /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.8/lib/active_support/dependencies.rb:136:in `require_dependency' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:414:in `load_application_classes' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:413:in `each' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:413:in `load_application_classes' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:411:in `each' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:411:in `load_application_classes' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:197:in `process' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:113:in `send' from /usr/lib/ruby/gems/1.8/gems/rails-2.3.8/lib/initializer.rb:113:in `run' from /home/george/projects/advancedcomfortcs/config/environment.rb:9 from ./test/test_helper.rb:2:in `require' from ./test/test_helper.rb:2 from ./test/unit/customer_test.rb:1:in `require' from ./test/unit/customer_test.rb:1 from /usr/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb:5:in `load' from /usr/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb:5 from /usr/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb:5:in `each' from /usr/lib/ruby/gems/1.8/gems/rake-0.8.7/lib/rake/rake_test_loader.rb:5 rake aborted! Command failed with status (1): [/usr/bin/ruby1.8 -I"lib:test" "/usr/lib/ru...] (See full trace by running task with --trace) It seems to have stepped on will_paginate somehow. Does anyone have any suggestions? Is there another way to do the validation I'm attempting to do? Thanks, George

    Read the article

  • Multiple asserts in single test?

    - by Gern Blandston
    Let's say I want to write a function that validates an email address with a regex. I write a little test to check my function and write the actual function. Make it pass. However, I can come up with a bunch of different ways to test the same function ([email protected]; [email protected]; test.test.com, etc). Do I put all the incantations that I need to check in the same, single test with several ASSERTS or do I write a new test for every single thing I can think of? Thanks!

    Read the article

  • Combining RequiresSTA and Timeout attributes on a test fails

    - by Peter Lillevold
    I have a test that opens and closes a WPF Window and thus requires the STA threading apartment. To safeguard the test against the window staying open (and thus hang the test indefinitely) I wanted to use the Timeout attribute. The problem is that applying the Timeout attribute causes the test to fail on timeout regardless of whether the test works or not. Without the attribute everything works fine. My theory is that Timeout causes the test to be executed on a new thread that does not inherit the STA apartment. Is there another way to have both STA and the timeout safeguard in NUnit? My test looks something like this:

        [Test, RequiresSTA, Timeout(300)]
        public void Construct()
        {
            var window = new WindowView();
            window.Loaded += (sender, args) => window.Close();
            var app = new Application();
            app.Run(window);
            try
            {
                // ...run system under test
            }
            finally
            {
                app.Shutdown();
            }
        }

    Read the article

  • Organizing test hierarchy in clojure project

    - by Sergey
    There are two directories in a clojure project - src/ and test/. There's a file my_methods.clj in the src/calc/ directory which starts with (ns calc.my_methods...). I want to create a test file for it in the test directory - test/my_methods-test.clj:

        (ns test.my_methods-test
          (:require [calc.my_methods])
          (:use clojure.test))

    In the $CLASSPATH there are both the project root directory and the src/ directory. But the exception is still "Could not locate calc/my_methods__init.class or calc/my_methods.clj on classpath". What is the problem with requiring it in the test file? echo $CLASSPATH gives this:

        ~/project:~/project/src

    Read the article

  • Spring - PropertiesPlaceholderConfigurer not finding properties file

    - by sat
    Not sure what could be wrong. I had an app that worked all along with this:

        <context:property-placeholder location="classpath:my.properties"/>

    There were no problems finding the properties file and hooking things up. Now I needed to encrypt some fields in the properties file, so I ended up writing a custom PropertiesPlaceholderConfigurer and tried to wire it up like this:

        <bean class="com.mycompany.myapp.PropertiesPlaceholderConfigurer">
            <property name="location" value="classpath:my.propeties"/>
        </bean>

    With this configuration, Spring complains that it cannot find the properties file:

        java.io.FileNotFoundException: class path resource [my.propeties] cannot be opened because it does not exist

    What in addition should be done? The custom placeholder configurer:

        package com.mycompany.myapp;

        import org.springframework.beans.factory.config.PropertyPlaceholderConfigurer;
        import org.springframework.util.ObjectUtils;

        import java.util.Enumeration;
        import java.util.Properties;

        public class PropertiesPlaceholderConfigurer extends PropertyPlaceholderConfigurer {

            @Override
            protected void convertProperties(Properties props) {
                Enumeration<?> propertyNames = props.propertyNames();
                while (propertyNames.hasMoreElements()) {
                    String propertyName = (String) propertyNames.nextElement();
                    String propertyValue = props.getProperty(propertyName);
                    if (propertyName.endsWith("encrypted")) {
                        System.out.println("Decrypting the property " + propertyName);
                        String convertedValue = decrypt(propertyValue);
                        System.out.println("Decrypted the property value to " + convertedValue);
                        if (!ObjectUtils.nullSafeEquals(propertyValue, convertedValue)) {
                            props.setProperty(propertyName, convertedValue);
                        }
                    }
                }
            }
        }

    Update: Forget my custom placeholder configurer; even the Spring-provided one has trouble if I replace the configuration with this:

        <bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
            <property name="location" value="classpath:my.propeties"/>
        </bean>

    What is context:property-placeholder doing that the bean definition can't?
Full stack trace java.lang.IllegalStateException: Failed to load ApplicationContext at org.springframework.test.context.CacheAwareContextLoaderDelegate.loadContext(CacheAwareContextLoaderDelegate.java:99) at org.springframework.test.context.DefaultTestContext.getApplicationContext(DefaultTestContext.java:101) at org.springframework.test.context.support.DependencyInjectionTestExecutionListener.injectDependencies(DependencyInjectionTestExecutionListener.java:109) at org.springframework.test.context.support.DependencyInjectionTestExecutionListener.prepareTestInstance(DependencyInjectionTestExecutionListener.java:75) at org.springframework.test.context.TestContextManager.prepareTestInstance(TestContextManager.java:319) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.createTest(SpringJUnit4ClassRunner.java:212) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner$1.runReflectiveCall(SpringJUnit4ClassRunner.java:289) at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.methodBlock(SpringJUnit4ClassRunner.java:291) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:232) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:89) at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238) at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63) at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236) at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53) at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229) at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61) at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71) at org.junit.runners.ParentRunner.run(ParentRunner.java:309) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:175) at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264) at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153) at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124) at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103) Caused by: org.springframework.beans.factory.BeanInitializationException: Could not load properties; nested exception is java.io.FileNotFoundException: class path resource [my.propeties] cannot be opened because it does not exist at org.springframework.beans.factory.config.PropertyResourceConfigurer.postProcessBeanFactory(PropertyResourceConfigurer.java:89) at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:265) at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:162) at org.springframework.context.support.AbstractApplicationContext.invokeBeanFactoryPostProcessors(AbstractApplicationContext.java:609) at 
org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:464) at org.springframework.test.context.support.AbstractGenericContextLoader.loadContext(AbstractGenericContextLoader.java:121) at org.springframework.test.context.support.AbstractGenericContextLoader.loadContext(AbstractGenericContextLoader.java:60) at org.springframework.test.context.support.AbstractDelegatingSmartContextLoader.delegateLoading(AbstractDelegatingSmartContextLoader.java:100) at org.springframework.test.context.support.AbstractDelegatingSmartContextLoader.loadContext(AbstractDelegatingSmartContextLoader.java:250) at org.springframework.test.context.CacheAwareContextLoaderDelegate.loadContextInternal(CacheAwareContextLoaderDelegate.java:64) at org.springframework.test.context.CacheAwareContextLoaderDelegate.loadContext(CacheAwareContextLoaderDelegate.java:91) at org.springframework.test.context.DefaultTestContext.getApplicationContext(DefaultTestContext.java:101) at org.springframework.test.context.support.DependencyInjectionTestExecutionListener.injectDependencies(DependencyInjectionTestExecutionListener.java:109) at org.springframework.test.context.support.DependencyInjectionTestExecutionListener.prepareTestInstance(DependencyInjectionTestExecutionListener.java:75) at org.springframework.test.context.TestContextManager.prepareTestInstance(TestContextManager.java:319) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.createTest(SpringJUnit4ClassRunner.java:212) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner$1.runReflectiveCall(SpringJUnit4ClassRunner.java:289) at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.methodBlock(SpringJUnit4ClassRunner.java:291) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:232) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:89) at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238) at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63) at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236) at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53) at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229) at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61) at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71) at org.junit.runners.ParentRunner.run(ParentRunner.java:309) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:175) at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264) at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153) at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124) at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103) Caused by: java.io.FileNotFoundException: class path resource [my.propeties] cannot be opened because it does not exist at 
org.springframework.core.io.ClassPathResource.getInputStream(ClassPathResource.java:158) at org.springframework.core.io.support.EncodedResource.getInputStream(EncodedResource.java:143) at org.springframework.core.io.support.PropertiesLoaderUtils.fillProperties(PropertiesLoaderUtils.java:98) at org.springframework.core.io.support.PropertiesLoaderSupport.loadProperties(PropertiesLoaderSupport.java:175) at org.springframework.core.io.support.PropertiesLoaderSupport.mergeProperties(PropertiesLoaderSupport.java:156) at org.springframework.beans.factory.config.PropertyResourceConfigurer.postProcessBeanFactory(PropertyResourceConfigurer.java:80) at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:265) at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:162) at org.springframework.context.support.AbstractApplicationContext.invokeBeanFactoryPostProcessors(AbstractApplicationContext.java:609) at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:464) at org.springframework.test.context.support.AbstractGenericContextLoader.loadContext(AbstractGenericContextLoader.java:121) at org.springframework.test.context.support.AbstractGenericContextLoader.loadContext(AbstractGenericContextLoader.java:60) at org.springframework.test.context.support.AbstractDelegatingSmartContextLoader.delegateLoading(AbstractDelegatingSmartContextLoader.java:100) at org.springframework.test.context.support.AbstractDelegatingSmartContextLoader.loadContext(AbstractDelegatingSmartContextLoader.java:250) at org.springframework.test.context.CacheAwareContextLoaderDelegate.loadContextInternal(CacheAwareContextLoaderDelegate.java:64) at org.springframework.test.context.CacheAwareContextLoaderDelegate.loadContext(CacheAwareContextLoaderDelegate.java:91) at org.springframework.test.context.DefaultTestContext.getApplicationContext(DefaultTestContext.java:101) at org.springframework.test.context.support.DependencyInjectionTestExecutionListener.injectDependencies(DependencyInjectionTestExecutionListener.java:109) at org.springframework.test.context.support.DependencyInjectionTestExecutionListener.prepareTestInstance(DependencyInjectionTestExecutionListener.java:75) at org.springframework.test.context.TestContextManager.prepareTestInstance(TestContextManager.java:319) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.createTest(SpringJUnit4ClassRunner.java:212) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner$1.runReflectiveCall(SpringJUnit4ClassRunner.java:289) at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.methodBlock(SpringJUnit4ClassRunner.java:291) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:232) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:89) at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238) at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63) at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236) at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53) at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229) at 
org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61) at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71) at org.junit.runners.ParentRunner.run(ParentRunner.java:309) at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:175) at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264) at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153) at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124) at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)

    Read the article

  • Test a simple multi-player (upto four players) Android game in single developer machine

    - by Kush
    I'm working on a multi-player Android game (it is so simple that it doesn't use any game engine). The game is based on Java sockets. Four devices connect to the game server, and a new thread manages their session. The game server will serve many such sessions (having 4 players each). What I'm worried about is the testing of this game. I know it is possible to run multiple Android emulators, but my development laptop is very limited in capabilities (3 GB RAM, 2 GHz Intel Core2Duo and on-board graphics). I'm already using Ubuntu to develop the game so that I have more user memory available than I'd have with Windows. Hence, the laptop will burn to death running 4 emulator instances. I don't have access to any Android device, nor do I have another machine with a higher configuration. And I still have to develop and test this game. P.S.: I'm a CS student, I currently don't work anywhere, and this game is a college project, so if there are any paid solutions, I cannot afford them. What can I do to test the app seamlessly? The ability to test even only 4 clients (i.e. only 1 session) would suffice; it's alright if I can't simulate a real environment with some 10-20 active game sessions (having 4 players each).

    Read the article

  • BI Applications Test Drive: Joint Partner+Oracle Go To Market Initiatives

    - by Mike.Hallett(at)Oracle-BI&EPM
    A challenge you may be facing is how to easily show the business value of BI to a set of customers. The key we find to achieving this is to show best-in-class business analytics examples specific to a business person's role and needs - e.g. "HR analytics" for HR professionals, "Spend Analytics" for procurement professionals, and so on. We have created for you, our specialised partners, the ability to run Oracle BI Applications Test Drive Workshops for your customers. These are carefully scripted to allow a customer business person (usually not IT) to navigate for themselves around a series of dashboards and analyses targeted to show how BI can help their business and drive ROI. These Oracle BI Applications Test Drive kits (in English) are now downloadable from our OMS4P/OPN portal. See them by clicking on this link: http://www.oracle.com/partners/secure/marketing/bi-apps-test-drive-519829.html Translations of this kit into Italian, French, Spanish and German will be added to the portal soon. NOTE: These are not designed for "training" customers: they really address the need for an effective call to action for any customer you talk to who is in the early stages of exploring their options and the business benefits of a BI project, especially if they are already an Oracle applications customer (E-Business Suite, PeopleSoft, Siebel, JDE). For more demand generation kits see another blog article, "Joint Partner+Oracle Go To Market Initiatives: BI Customer Event Kits".

    Read the article

  • Isolating test data in acceptance tests

    - by Matt Phillips
    I'm looking for guidance on how to keep my acceptance tests isolated. Right now the issue I'm having with being able to run the tests in parallel is the database records that are manipulated in the tests. I've written helpers that take care of doing inserts and deletes before tests are executed, to make sure the state is correct. But now I can't run them in parallel against the same database without uniquely generating the test data fields for each test. For example: when testing creating a row, I'll delete everything where column A = foo and column B = bar, then navigate through the UI in the test and create a record with column A = foo and column B = bar. When testing that a duplicate row is not allowed to be created, I'll insert a row with column A = foo and column B = bar and then use the UI to try to do the exact same thing, which displays an error message in the UI as expected. These tests work perfectly when run separately and serially, but I can't run them at the same time for fear that one will create or delete a record the other is expecting. Any tips on how to structure them better so they can be run in parallel?
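
    As a small illustration of the "uniquely generating the test data fields" idea mentioned in the question (a sketch with invented helper names, not the asker's code), each test can stamp its own values so parallel runs never collide on the same rows:

        using System;

        // Generates per-test-unique values so parallel tests never touch each other's rows.
        public static class TestData
        {
            public static string Unique(string prefix)
            {
                return prefix + "-" + Guid.NewGuid().ToString("N");
            }
        }

        // Usage inside a test (CreateRowThroughUi / AssertRowExists are hypothetical suite helpers):
        //   var columnA = TestData.Unique("foo");   // e.g. "foo-3f2c9e..."
        //   var columnB = TestData.Unique("bar");
        //   CreateRowThroughUi(columnA, columnB);
        //   AssertRowExists(columnA, columnB);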

    Read the article

  • What technical test should I give to a job candidate

    - by Romain Braun
    I'm not sure if this is the right Stack Exchange website, but: I have three candidates coming in tomorrow. One has 15 years of experience in PHP, and the two others have about 1 year of experience in PHP/front-end development. For the latter two I was thinking about a test where they would have to develop a web app allowing users to manage other users, as in: display a list of users, display a single user, modify a user, and add extended properties to a user. This way it would feature HTML, CSS, JS, Ajax, PHP and SQL. Do you think this would be a good test? What test should I give to the first one? He needs something much more difficult, I guess. I'm also listening if you have any advice/ideas about what makes a good developer, and what I should pay attention to in the candidates' code. I was also considering thinking outside of the box, more algorithm-related, and asking him to write the fastest function to tell if a number is a prime number, because there are a lot of optimizations you can apply to such a function. They have one day to do it.
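
    For reference, the prime-number exercise mentioned in the question typically invites optimizations such as skipping even numbers and stopping at the square root. A quick sketch (in C#, though the idea is language-agnostic; this is one possible answer, not the expected one):

        // Skip even numbers and multiples of 3, and stop at the square root:
        // every prime > 3 has the form 6k +/- 1, so only those candidates are tested.
        public static bool IsPrime(long n)
        {
            if (n < 2) return false;
            if (n < 4) return true;                       // 2 and 3
            if (n % 2 == 0 || n % 3 == 0) return false;

            for (long i = 5; i * i <= n; i += 6)
            {
                if (n % i == 0 || n % (i + 2) == 0) return false;
            }
            return true;
        }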

    Read the article

  • Joining Windows 7 Professional to a Windows Server 2003 R2 x64 domain fails.

    - by Vinko Vrsalovic
    I have a windows 7 professional (spanish) laptop trying to join a Windows Server 2003 (english) domain. It detect correctly the SRV record, finding the proper domain controller, but then the join fails with the error message (snippet, because the error is in spanish) An Active Directory Domain Controller for This Domain Could Not be Contacted The DNS is correctly set, and client can ping by name and IP the server, the server can ping the client by IP. I've tested with the FW down to no avail. A host of other XP Pro clients are connected to the domain. I've restarted Net Logon and checked that Windows Time is up. Also the times are in sync between the server and the client. I'll put below diagnostics output. I'm wondering if there's anything special to be done on either the server or the client to have a Win 7 Pro join a 2k3 R2 domain. The following diagnostic information follows: netdiag /q for the DC dcdiag on the DC ipconfig /all on the Win 7 client netdiag /q on the DC: .................................. Computer Name: HI-X2 DNS Host Name: hi-x2.hi.local System info : Microsoft Windows Server 2003 R2 (Build 3790) Processor : EM64T Family 6 Model 23 Stepping 10, GenuineIntel List of installed hotfixes : KB923561 KB924667-v2 KB925398_WMP64 KB925902 KB926122 KB927891 KB929123 KB930178 KB932168 KB936357 KB938127 KB941569 KB942830 KB942831 KB943055 KB943460 KB944338-v2 KB944653 KB945553 KB946026 KB948496 KB950760 KB950762 KB950974 KB951066 KB951748 KB952004 KB952069 KB952954 KB954155 KB954550-v7 KB955069 KB955759 KB956572 KB956802 KB956803 KB956844 KB958469 KB958644 KB958869 KB959426 KB960225 KB960803 KB960859 KB961063 KB961118 KB961501 KB967715 KB967723 KB968389 KB968816 KB969059 KB969947 KB970238 KB970430 KB970483 KB971032 KB971468 KB971657 KB971737 KB971961 KB971961-IE8 KB972270 KB973037 KB973354 KB973507 KB973540 KB973687 KB973815 KB973825 KB973869 KB973904 KB973917-v2 KB974112 KB974318 KB974392 KB974571 KB975025 KB975467 KB975560 KB975713 KB976662-IE8 KB977290 KB977816 KB977914 KB978037 KB978262 KB978338 KB978542 KB978601 KB978706 KB979306 KB979309 KB979683 KB980182 KB980182-IE8 KB980232 KB980302-IE8 KB981332-IE8 KB981350 Q147222 Per interface results: Adapter : Local Area Connection Host Name. . . . . . . . . : hi-x2.hi.local IP Address . . . . . . . . : 10.0.1.199 Subnet Mask. . . . . . . . : 255.0.0.0 Default Gateway. . . . . . : 10.0.1.1 Dns Servers. . . . . . . . : 10.0.1.199 WINS service test. . . . . : Skipped Global results: [WARNING] You don't have a single interface with the 'WorkStation Service', 'Messenger Service', 'WINS' names defined. DNS test . . . . . . . . . . . . . : Passed PASS - All the DNS entries for DC are registered on DNS server '10.0.1.199'. IP Security test . . . . . . . . . : Skipped The command completed successfully dcdiag on the DC: Domain Controller Diagnosis Performing initial setup: Done gathering initial info. Doing initial required tests Testing server: Default-First-Site-Name\HI-X2 Starting test: Connectivity ......................... HI-X2 passed test Connectivity Doing primary tests Testing server: Default-First-Site-Name\HI-X2 Starting test: Replications ......................... HI-X2 passed test Replications Starting test: NCSecDesc ......................... HI-X2 passed test NCSecDesc Starting test: NetLogons ......................... HI-X2 passed test NetLogons Starting test: Advertising ......................... HI-X2 passed test Advertising Starting test: KnowsOfRoleHolders ......................... 
HI-X2 passed test KnowsOfRoleHolders Starting test: RidManager ......................... HI-X2 passed test RidManager Starting test: MachineAccount ......................... HI-X2 passed test MachineAccount Starting test: Services ......................... HI-X2 passed test Services Starting test: ObjectsReplicated ......................... HI-X2 passed test ObjectsReplicated Starting test: frssysvol ......................... HI-X2 passed test frssysvol Starting test: frsevent ......................... HI-X2 passed test frsevent Starting test: kccevent ......................... HI-X2 passed test kccevent Starting test: systemlog ......................... HI-X2 passed test systemlog Starting test: VerifyReferences ......................... HI-X2 passed test VerifyReferences Running partition tests on : ForestDnsZones Starting test: CrossRefValidation ......................... ForestDnsZones passed test CrossRefValidation Starting test: CheckSDRefDom ......................... ForestDnsZones passed test CheckSDRefDom Running partition tests on : DomainDnsZones Starting test: CrossRefValidation ......................... DomainDnsZones passed test CrossRefValidation Starting test: CheckSDRefDom ......................... DomainDnsZones passed test CheckSDRefDom Running partition tests on : Schema Starting test: CrossRefValidation ......................... Schema passed test CrossRefValidation Starting test: CheckSDRefDom ......................... Schema passed test CheckSDRefDom Running partition tests on : Configuration Starting test: CrossRefValidation ......................... Configuration passed test CrossRefValidation Starting test: CheckSDRefDom ......................... Configuration passed test CheckSDRefDom Running partition tests on : hi Starting test: CrossRefValidation ......................... hi passed test CrossRefValidation Starting test: CheckSDRefDom ......................... hi passed test CheckSDRefDom Running enterprise tests on : hi.local Starting test: Intersite ......................... hi.local passed test Intersite Starting test: FsmoCheck ......................... hi.local passed test FsmoCheck ipconfig /all on the Windows 7 client: Configuraci¢n IP de Windows Nombre de host. . . . . . . . . : hi-p6 Sufijo DNS principal . . . . . : Tipo de nodo. . . . . . . . . . : h¡brido Enrutamiento IP habilitado. . . : no Proxy WINS habilitado . . . . . : no Adaptador de LAN inal mbrica Conexi¢n de red inal mbrica: Sufijo DNS espec¡fico para la conexi¢n. . : Descripci¢n . . . . . . . . . . . . . . . : Intel(R) WiFi Link 5100 AGN Direcci¢n f¡sica. . . . . . . . . . . . . : 00-22-FB-63-47-A0 DHCP habilitado . . . . . . . . . . . . . : no Configuraci¢n autom tica habilitada . . . : s¡ Direcci¢n IPv4. . . . . . . . . . . . . . : 10.0.1.42(Preferido) M scara de subred . . . . . . . . . . . . : 255.255.255.0 Puerta de enlace predeterminada . . . . . : 10.0.1.1 Servidores DNS. . . . . . . . . . . . . . : 10.0.1.199 NetBIOS sobre TCP/IP. . . . . . . . . . . : habilitado Adaptador de Ethernet Conexi¢n de  rea local: Estado de los medios. . . . . . . . . . . : medios desconectados Sufijo DNS espec¡fico para la conexi¢n. . : Descripci¢n . . . . . . . . . . . . . . . : Realtek PCIe GBE Family Controller Direcci¢n f¡sica. . . . . . . . . . . . . : 00-1E-33-1F-35-B1 DHCP habilitado . . . . . . . . . . . . . : s¡ Configuraci¢n autom tica habilitada . . . : s¡ Adaptador de t£nel isatap.{8926581E-09AC-4123-906B-DA6386AD2D60}: Estado de los medios. . . . . . . . . . . 
: medios desconectados Sufijo DNS espec¡fico para la conexi¢n. . : Descripci¢n . . . . . . . . . . . . . . . : Adaptador ISATAP de Microsoft Direcci¢n f¡sica. . . . . . . . . . . . . : 00-00-00-00-00-00-00-E0 DHCP habilitado . . . . . . . . . . . . . : no Configuraci¢n autom tica habilitada . . . : s¡ Adaptador de t£nel Teredo Tunneling Pseudo-Interface: Sufijo DNS espec¡fico para la conexi¢n. . : Descripci¢n . . . . . . . . . . . . . . . : Teredo Tunneling Pseudo-Interface Direcci¢n f¡sica. . . . . . . . . . . . . : 00-00-00-00-00-00-00-E0 DHCP habilitado . . . . . . . . . . . . . : no Configuraci¢n autom tica habilitada . . . : s¡ Direcci¢n IPv6 . . . . . . . . . . : 2001:0:5ef5:73ba:1cec:3883:f5ff:fed5(Preferido) V¡nculo: direcci¢n IPv6 local. . . : fe80::1cec:3883:f5ff:fed5%13(Preferido) Puerta de enlace predeterminada . . . . . : :: NetBIOS sobre TCP/IP. . . . . . . . . . . : deshabilitado

    Read the article

  • Groovy JUnit test support

    - by Martin Janicek
    Good news everyone! I've implemented support for Groovy JUnit tests, which basically means you can finally use Groovy in the area where it is so highly productive! You can create a new Groovy JUnit test via New File/Groovy/Groovy JUnit test, and it should behave in the same way as for Java tests. That means that if there is no JUnit setup in your project yet, you can choose between the JUnit 3 and JUnit 4 templates, and the project settings will be changed to match your choice (in the case of Maven-based projects the correct dependencies and plugins are added to the pom.xml, and in the case of Ant-based projects the JUnit dependency is configured). Or, if the project is already configured, the correct template will be used. After that the test skeleton is created and you can write your own code and of course run the tests together with the Java ones. Some of you were asking for this feature, and of course I don't expect it will be perfect from the beginning, so I would be really glad to see some constructive feedback about what could be improved and/or redesigned ;] ...at the end I have to say that the feature is not active for Ant-based Java EE projects yet (I'm aware of it and it will be fixed for the NetBeans 7.3 final - actually it will be done in a few days/weeks, just want you to know). But it's already complete in all types of Maven-based projects and also for Ant-based J2SE projects. And as always, the daily build where you can try the feature can be downloaded right here, so don't hesitate to try it!

    Read the article

  • Resurrecting a 5,000 line test plan that is a decade old

    - by ale
    I am currently building a test plan for the system I am working on. The plan is 5,000 lines long and about 10 years old. The structure is like this:

        1. test title
           precondition: some W needs to be set up, X needs to be completed
           action: do some Y
           postcondition: message saying Z is displayed
        2. ...

    What is this type of testing called? Is it useful? It isn't automated; the tests would have to be handed to some unlucky person to run through, and then the results would have to be given to development. It doesn't seem efficient. Is it worth modernising this method of testing (removing tests for removed features, updating tests where different postconditions happen, ...) or would a whole different approach be more appropriate? We plan to start unit tests, but the software requires so much work to actually get 'units' to test - there are no units at present! Thank you.
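
    For what it's worth, each step of a plan like the one above maps fairly directly onto an automated test: the precondition becomes the Arrange phase, the action the Act phase, and the postcondition the Assert. A rough sketch in C# with NUnit - the helper types are trivial stand-ins invented so the example compiles, not anything from the plan itself:

        using NUnit.Framework;

        // Trivial stand-ins so the sketch compiles; in reality these would be the
        // application's own setup helpers and the code driving the system under test.
        public class Widget { public string Message = ""; }

        public static class TestSystem
        {
            public static Widget SetUpSomeW() { return new Widget(); }
            public static void CompleteX(Widget w) { }
            public static Widget DoSomeY(Widget w) { w.Message = "Z"; return w; }
        }

        [TestFixture]
        public class MigratedPlanTests
        {
            [Test]
            public void DoingSomeY_DisplaysMessageZ()
            {
                // Arrange: "some W needs to be set up, X needs to be completed"
                var w = TestSystem.SetUpSomeW();
                TestSystem.CompleteX(w);

                // Act: "do some Y"
                var result = TestSystem.DoSomeY(w);

                // Assert: "message saying Z is displayed"
                Assert.AreEqual("Z", result.Message);
            }
        }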

    Read the article

  • MSTest Test Context Exception Handling

    - by Flip
    Is there a way that I can get to the exception that was handled by the MSTest framework, using the TestContext or some other method on a base test class? If an unhandled exception occurs in one of my tests, I'd like to spin through all the items in the exception.Data dictionary and display them in the test result to help me figure out why the test failed (we usually add data to the exception to help us debug in the production environment, so I'd like to do the same for testing). Note: I am not testing that an exception was SUPPOSED TO HAPPEN (I have other tests for that), I am testing a valid case, I just need to see the exception data. Here is a code example of what I'm talking about:

        [TestMethod]
        public void IsFinanceDeadlineDateValid()
        {
            var target = new BusinessObject();
            SetupBusinessObject(target);

            // How can I capture this in the test context so I can display all the data
            // in the exception in the test result...
            var expected = 100;
            try
            {
                Assert.AreEqual(expected, target.PerformSomeCalculationThatMayDivideByZero());
            }
            catch (Exception ex)
            {
                ex.Data.Add("SomethingImportant", "I want to see this in the test result, as its important");
                ex.Data.Add("Expected", expected);
                throw ex;
            }
        }

    I understand there are issues around why I probably shouldn't have such an encapsulating method, but we also have sub-tests to test all the functionality of PerformSomeCalculation... However, if the test fails, 99% of the time it passes when I rerun it, so I can't debug anything without this information. I would also like to do this on a GLOBAL level, so that if any test fails, I get the information in the test results, as opposed to doing it for each individual test. Here is the code that would put the exception info in the test results:

        public void AddDataFromExceptionToResults(Exception ex)
        {
            StringBuilder whereAmI = new StringBuilder();
            var holdException = ex;
            while (holdException != null)
            {
                Console.WriteLine(whereAmI.ToString() + "--" + holdException.Message);
                foreach (var item in holdException.Data.Keys)
                {
                    Console.WriteLine(whereAmI.ToString() + "--Data--" + item + ":" + holdException.Data[item]);
                }
                holdException = holdException.InnerException;
            }
        }

    Read the article

  • Test Driven Development For Complex Methods involving external dependency

    - by bill_tx
    I am implementing a service contract for a WCF service. As per TDD, I wrote a test case that just passes using hardcoded values. After that I started to put real logic into my service implementation. The actual logic relies on 3-4 external services and a database. What should I do with the original test case that I wrote? If I keep it the same, then in order to make the test pass it will have to call several other external services. So my general question is: what should I do if I write a test case for a business facade first, using TDD, and the real logic I add later involves external dependencies?
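
    One common answer to this situation is to push the external services and the database behind interfaces injected into the facade, so the original unit test keeps passing against fakes while separate integration tests exercise the real dependencies. A minimal sketch with invented names (not the asker's service contract):

        // Invented seams for the external dependencies.
        public interface IRateService { decimal GetRate(string code); }
        public interface IOrderRepository { void Save(Order order); }

        public class Order
        {
            public string Code;
            public decimal Total;
        }

        // The service implementation depends only on the interfaces.
        public class QuoteService
        {
            private readonly IRateService rates;
            private readonly IOrderRepository orders;

            public QuoteService(IRateService rates, IOrderRepository orders)
            {
                this.rates = rates;
                this.orders = orders;
            }

            public decimal Quote(string code, int quantity)
            {
                decimal total = rates.GetRate(code) * quantity;   // real logic, no hardcoding
                orders.Save(new Order { Code = code, Total = total });
                return total;
            }
        }

        // The original TDD test now supplies fakes instead of live services.
        public class FakeRateService : IRateService
        {
            public decimal GetRate(string code) { return 2.5m; }
        }

        public class FakeOrderRepository : IOrderRepository
        {
            public void Save(Order order) { /* nothing to persist in the unit test */ }
        }

        // e.g. in the test body:
        //   var sut = new QuoteService(new FakeRateService(), new FakeOrderRepository());
        //   Assert.AreEqual(25m, sut.Quote("X", 10));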

    Read the article

  • SQLite/Fluent NHibernate integration test harness initialization not repeatable after large data se

    - by Mark Rogers
    In one of my main data integration test harnesses I create and use Fluent NHibernate's SingleConnectionSessionSourceForSQLiteInMemoryTesting to get a fresh session for each test. After each test, I close the connection, session, and session factory, and throw out the nested StructureMap container they came from. This works for almost any simple data integration test I can think of, including ones that utilize Fluent NHibernate's PersistenceSpecification object. When I test the application's lengthy database bootstrapping process, which creates and saves thousands of domain objects, I start seeing issues. It's not that the setup and teardown fail; in fact, the test successfully bootstraps the in-memory database just as the application would bootstrap the real database in the production environment. The problem occurs when the database bootstrapping occurs a second time on a new in-memory database, with a new session and session factory. The error is:

        NHibernate.StaleStateException : Unexpected row count: 0; expected: 1

    The row count is indeed unexpected; the row that the application under test is looking for should be in the session. You see, it's not that any data from the last integration test is sticking around, it's that for some reason the session just stops working mid-database-bootstrap. And I've looked everywhere for a place I might be holding on to an old session and I can't find one. I've searched through the code for static singleton objects, but there are none anywhere near the code in question. I have a couple of StructureMap InstanceScope singletons, but they are getting thrown out with each nested container that is lost after every test teardown. I've tried every possible variation on disposing and closing every object involved with each test teardown, and it still fails on this lengthy database bootstrap. But non-bootstrap-related database tests appear to work fine. I'm starting to run out of options and may have to surrender lengthy database integration tests in favor of WatiN-based acceptance tests. Can anyone give me any clue about how I can figure out why my SingleConnectionSessionSourceForSQLiteInMemoryTesting sessions aren't repeatable? Any advice at all about how to make an NHibernate SQLite database integration test harness repeatable?

    Read the article
