Search Results

Search found 14022 results on 561 pages for 'coded ui tests'.

Page 151/561

  • Grails Testing hiccups

    - by egervari
    I have two testing questions. Both are probably easily answered. The first is that I wrote this unit test in Grails: void testCount() { mockDomain(UserAccount) new UserAccount(firstName: "Ken").save() new UserAccount(firstName: "Bob").save() new UserAccount(firstName: "Dave").save() assertEquals(3, UserAccount.count()) } For some reason, I get 0 returned back. Did I forget to do something? The second question is for those who use IDEA. What should I be running - IDEA's JUnit tests, or Grails targets? I have two options. Also, why does IDEA say that my tests pass and give a green light even though the test above actually fails? This will really drive me nuts if I have to check the test reports in HTML every time I run my tests... Help?

    Read the article

  • Generating JavaDoc style documentation

    - by Walter White
    Hi all, I would like to generate a report similar to JavaDoc so that you can really easily click on a test, result, and source. I am running HtmlUnit tests, so I will have the result (HTML), source (request, headers, parameters, etc.), and stack trace all visible so a developer or QA can go back later to review this and see what went awry. So, in the left frame, the tests will be listed along with the group they were a part of (similar to packages in JavaDoc). In the right frame, the results will be presented along with the source and stack trace. How can I achieve this? The HtmlUnit tests are part of the project and not a stand-alone plugin, if that matters. Thanks, Walter
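
    One way to get a report in that shape without a reporting product is to hang a listener off the JUnit runner and have it write the HTML yourself. The sketch below is only a minimal illustration of that idea, not a known library: HtmlReportListener, report.html and MyHtmlUnitTest are invented names, stack traces are written unescaped to keep it short, and the page is a single document with anchors rather than a true frameset.

        import java.io.PrintWriter;
        import java.util.LinkedHashMap;
        import java.util.Map;

        import org.junit.runner.Description;
        import org.junit.runner.JUnitCore;
        import org.junit.runner.Result;
        import org.junit.runner.notification.Failure;
        import org.junit.runner.notification.RunListener;

        // Collects one entry per test and writes a single HTML page: a linked
        // list of tests at the top, the per-test detail (status, stack trace,
        // and whatever else gets appended) underneath.
        public class HtmlReportListener extends RunListener {

            private final Map<String, String> details = new LinkedHashMap<String, String>();

            @Override
            public void testFinished(Description d) {
                String name = d.getClassName() + "." + d.getMethodName();
                if (!details.containsKey(name)) {   // no failure was recorded first
                    details.put(name, "PASSED");
                }
            }

            @Override
            public void testFailure(Failure f) {
                Description d = f.getDescription();
                details.put(d.getClassName() + "." + d.getMethodName(),
                        "FAILED\n" + f.getTrace());
            }

            @Override
            public void testRunFinished(Result result) throws Exception {
                PrintWriter out = new PrintWriter("report.html", "UTF-8");
                out.println("<html><body><h1>Test report</h1><ul>");
                for (String name : details.keySet()) {
                    out.println("<li><a href='#" + name + "'>" + name + "</a></li>");
                }
                out.println("</ul>");
                for (Map.Entry<String, String> e : details.entrySet()) {
                    out.println("<h2 id='" + e.getKey() + "'>" + e.getKey() + "</h2>");
                    out.println("<pre>" + e.getValue() + "</pre>");
                }
                out.println("</body></html>");
                out.close();
            }

            public static void main(String[] args) {
                JUnitCore core = new JUnitCore();
                core.addListener(new HtmlReportListener());
                core.run(MyHtmlUnitTest.class); // placeholder for the real test classes
            }
        }

    A frameset version would only differ in writing the list and the details to two files; attaching the HtmlUnit request/response dumps would mean appending them to the per-test detail string.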

    Read the article

  • Can't get my head around background workers in .NET

    - by Connel
    I have wrote an application that syncs two folders together. The problem with the program is that it stops responding whilst copying files. A quick search of stack-overflow told me I need to use something called a background worker. I have read a few pages on the net about this but find it really hard to understand as I'm pretty new to programming. Below is the code for my application - how can I simply put all of the File.Copy(....) commands into their own background worker (if that's even how it works)? Below is the code for the button click event that runs the sub procedure and the sub procedure I wish to use a background worker on all the File.Copy lines. Button event: protected virtual void OnBtnSyncClicked (object sender, System.EventArgs e) { //sets running boolean to true booRunning=true; //sets progress bar to 0 prgProgressBar.Fraction = 0; //resets values used by progressbar dblCurrentStatus = 0; dblFolderSize = 0; //tests if user has entered the same folder for both target and destination if (fchDestination.CurrentFolder == fchTarget.CurrentFolder) { //creates message box MessageDialog msdSame = new MessageDialog(this, DialogFlags.Modal, MessageType.Error, ButtonsType.Close, "You cannot sync two folders that are the same"); //sets message box title msdSame.Title="Error"; //sets respone type ResponseType response = (ResponseType) msdSame.Run(); //if user clicks on close button or closes window then close message box if (response == ResponseType.Close || response == ResponseType.DeleteEvent) { msdSame.Destroy(); } return; } //tests if user has entered a target folder that is an extension of the destination folder // or if user has entered a desatination folder that is an extension of the target folder if (fchTarget.CurrentFolder.StartsWith(fchDestination.CurrentFolder) || fchDestination.CurrentFolder.StartsWith(fchTarget.CurrentFolder)) { //creates message box MessageDialog msdContains = new MessageDialog(this, DialogFlags.Modal, MessageType.Error, ButtonsType.Close, "You cannot sync a folder with one of its parent folders"); //sets message box title msdContains.Title="Error"; //sets respone type and runs message box ResponseType response = (ResponseType) msdContains.Run(); //if user clicks on close button or closes window then close message box if (response == ResponseType.Close || response == ResponseType.DeleteEvent) { msdContains.Destroy(); } return; } //gets folder size of target folder FileSizeOfTarget(fchTarget.CurrentFolder); //gets folder size of destination folder FileSizeOfDestination(fchDestination.CurrentFolder); //runs SyncTarget procedure SyncTarget(fchTarget.CurrentFolder); //runs SyncDestination procedure SyncDestination(fchDestination.CurrentFolder); //informs user process is complete prgProgressBar.Text = "Finished"; //sets running bool to false booRunning = false; } Sync sub-procedure: protected void SyncTarget (string strCurrentDirectory) { //string array of all the directories in directory string[] staAllDirectories = Directory.GetDirectories(strCurrentDirectory); //string array of all the files in directory string[] staAllFiles = Directory.GetFiles(strCurrentDirectory); //loop over each file in directory foreach (string strFile in staAllFiles) { //string of just the file's name and not its path string strFileName = System.IO.Path.GetFileName(strFile); //string containing directory in target folder string strDirectoryInsideTarget = System.IO.Path.GetDirectoryName(strFile).Substring(fchTarget.CurrentFolder.Length); //inform user as to what file is being 
copied prgProgressBar.Text="Syncing " + strFile; //tests if file does not exist in destination folder if (!File.Exists(fchDestination.CurrentFolder + "/" + strDirectoryInsideTarget + "/" + strFileName)) { //if file does not exist copy it to destination folder, the true below means overwrite if file already exists File.Copy (strFile, fchDestination.CurrentFolder + "/" + strDirectoryInsideTarget + "/" + strFileName, true); } //tests if file does exist in destination folder if (File.Exists(fchDestination.CurrentFolder + "/" + strDirectoryInsideTarget + "/" + strFileName)) { //long (number) that contains date of last write time of target file long lngTargetFileDate = File.GetLastWriteTime(strFile).ToFileTime(); //long (number) that contains date of last write time of destination file long lngDestinationFileDate = File.GetLastWriteTime(fchDestination.CurrentFolder + "/" + strDirectoryInsideTarget + "/" + strFileName).ToFileTime(); //tests if target file is newer than destination file if (lngTargetFileDate > lngDestinationFileDate) { //if it is newer then copy file from target folder to destination folder File.Copy (strFile, fchDestination.CurrentFolder + "/" + strDirectoryInsideTarget + "/" + strFileName, true); } } //gets current file size FileInfo FileSize = new FileInfo(strFile); //sets file's filesize to dblCurrentStatus and adds it to current total of files dblCurrentStatus = dblCurrentStatus + FileSize.Length; double dblPercentage = dblCurrentStatus/dblFolderSize; prgProgressBar.Fraction = dblPercentage; } //loop over each folder in target folder foreach (string strDirectory in staAllDirectories) { //string containing directories inside target folder but not any higher directories string strDirectoryInsideTarget = strDirectory.Substring(fchTarget.CurrentFolder.Length); //tests if directory does not exist inside destination folder if (!Directory.Exists(fchDestination.CurrentFolder + "/" + strDirectoryInsideTarget)) { //it directory does not exisit create it Directory.CreateDirectory(fchDestination.CurrentFolder + "/" + strDirectoryInsideTarget); } //run sync on all files in directory SyncTarget(strDirectory); } } Any help will be greatly appreciated as after this the program will pretty much be finished :D

    Read the article

  • How would you make a blog with a TDD approach?

    - by Earlz
    I'm considering remaking my blog (currently in PHP, but <100 lines of non-layout code) in Ruby on Rails just for the fun of it. I want to make another project in Rails, but I should learn Rails (more than hello world) before I try to create a full project. Another thing I want to do while remaking my blog is to at least figure out what TDD is all about. So how would you go about taking a test-driven approach to the creation of a blog? What tests would you write? How would you begin? Every time I visualize writing a blog, it ends up needing a million tests for a single component to fully test it. How do I avoid writing too many tests? Also, I am making this community wiki because I intend for this to basically be made into a mini tutorial/knowledge base...

    Read the article

  • Can't get Zend Studio and PHPUnit to work together

    - by dimbo
    I have a created a simple doctrine2/zend skeleton project and am trying to get unit testing working with zend studio. The tests work perfectly through the PHPunit CLI but I just can't get them to work in zend studio. It comes up with an error saying : 'No Tests was executed' and the following output in the debug window : X-Powered-By: PHP/5.2.14 ZendServer/5.0 Set-Cookie: ZendDebuggerCookie=127.0.0.1%3A10137%3A0||084|77742D65|1016; path=/ Content-type: text/html <br /> <b>Warning</b>: Unexpected character in input: '\' (ASCII=92) state=1 in <b>/var/www/z2d2/tests/application/models/UserModelTest.php</b> on line <b>8</b><br /> <br /> <b>Warning</b>: Unexpected character in input: '\' (ASCII=92) state=1 in <b>/var/www/z2d2/tests/application/models/UserModelTest.php</b> on line <b>8</b><br /> <br /> <b>Parse error</b>: syntax error, unexpected T_STRING in <b>/var/www/z2d2/tests/application/models/UserModelTest.php</b> on line <b>8</b><br /> The test is as follows: <?php require_once 'Zend/Application.php'; require_once 'Zend/Test/PHPUnit/ControllerTestCase.php'; abstract class ControllerTestCase extends Zend_Test_PHPUnit_ControllerTestCase { public function setUp() { $this->bootstrap = new Zend_Application( 'testing', APPLICATION_PATH . '/configs/application.ini' ); parent::setUp(); } public function tearDown() { parent::tearDown(); } } <?php class IndexControllerTest extends ControllerTestCase { public function testDoesHomePageExist() { $this->dispatch('/'); $this->assertController('index'); $this->assertAction('index'); } } <?php class ModelTestCase extends PHPUnit_Framework_TestCase { protected $em; public function setUp() { $application = new Zend_Application( 'testing', APPLICATION_PATH . '/configs/application.ini' ); $bootstrap = $application->bootstrap()->getBootstrap(); $this->em = $bootstrap->getResource('entityManager'); parent::setUp(); } public function tearDown() { parent::tearDown(); } } <?php class UserModelTest extends ModelTestCase { public function testCanInstantiateUser() { $this->assertInstanceOf('\Entities\User', new \Entities\User); } public function testCanSaveAndRetrieveUser() { $user = new \Entities\User; $user->setFirstname('wjgilmore-test'); $user->setemail('[email protected]'); $user->setpassword('jason'); $user->setAddress1('calle san antonio'); $user->setAddress2('albayzin'); $user->setSurname('testman'); $user->setConfirmed(TRUE); $this->em->persist($user); $this->em->flush(); $user = $this->em->getRepository('Entities\User')->findOneByFirstname('wjgilmore-test'); $this->assertEquals('wjgilmore-test', $user->getFirstname()); } public function testCanDeleteUser() { $user = new \Entities\User; $user = $this->em->getRepository('Entities\User')->findOneByFirstname('wjgilmore-test'); $this->em->remove($user); $this->em->flush(); } } And the bootstrap: <?php define('BASE_PATH', realpath(dirname(__FILE__) . '/../../')); define('APPLICATION_PATH', BASE_PATH . '/application'); set_include_path( '.' . PATH_SEPARATOR . BASE_PATH . '/library' . PATH_SEPARATOR . get_include_path() ); require_once 'controllers/ControllerTestCase.php'; require_once 'models/ModelTestCase.php'; Here is the new error after setting PHP Executable to 5.3 as Gordon suggested: X-Powered-By: PHP/5.3.3 ZendServer/5.0 Set-Cookie: ZendDebuggerCookie=127.0.0.1%3A10137%3A0||084|77742D65|1000; path=/ Content-type: text/html <br /> <b>Fatal error</b>: Class 'ModelTestCase' not found in <b>/var/www/z2d2/tests/application/models/UserModelTest.php</b> on line <b>4</b><br />

    Read the article

  • ExpectedException on TestMethod Visual Studio 2010

    - by Joop
    Today I upgraded my solution with all the underlying projects from VS2008 to VS2010. Everything went well except for my unit tests. First of all, only the web projects had .NET 4 as the target framework; all the other projects still had .NET 3.5, so I changed them all to .NET 4. Now when I debug my unit tests, it breaks on every exception. In 2008 it would just fail the test and tell me that an exception occurred. Even when I have the ExpectedException attribute defined, it stops debugging on every exception. An example of one of my tests: [TestMethod] [ExpectedException(typeof(EntityDoesNotExistException))] public void ConstructorTest() { AddressType type = new AddressType(int.MaxValue); } The EntityDoesNotExistException is a custom exception and inherits from Exception.

    Read the article

  • Inspect in memory hsqldb while debugging

    - by Albert
    We're using hsqldb in memory to run JUnit tests which operate against a database. The db is set up before running each test via a Spring configuration. All works fine. Now when a test fails it can be convenient to be able to inspect the values in the in-memory database. Is this possible? If so, how? Our URL is: jdbc.url=jdbc:hsqldb:mem:testdb;sql.enforce_strict_size=true The database is destroyed after each test, but while the debugger is running the database should still be alive. I've tried connecting with the hsqldb DatabaseManager. That works, but I don't see any tables or data. Any help is highly appreciated!
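
    One thing that often trips people up with mem: URLs is that the database only exists inside the JVM that created it, so an externally started DatabaseManager connects to a different, empty "testdb". A workaround I have used is to open HSQLDB's Swing manager from inside the test JVM itself - for example from a breakpoint's "evaluate expression", or from a helper like the sketch below. The class name and the crude sleep are my own, the default sa user with an empty password is an assumption about the test setup, and the --noexit flag is there so closing the window does not kill the test JVM.

        import org.hsqldb.util.DatabaseManagerSwing;

        public final class DbInspector {

            // Opens the HSQLDB GUI against the *same* in-memory database the test
            // uses. A mem: database lives inside one JVM, which is why an
            // externally started manager only ever sees a fresh, empty "testdb".
            public static void openAndWait() throws InterruptedException {
                DatabaseManagerSwing.main(new String[] {
                        "--url", "jdbc:hsqldb:mem:testdb;sql.enforce_strict_size=true",
                        "--user", "sa",
                        "--password", "",
                        "--noexit"   // closing the window must not kill the test JVM
                });
                Thread.sleep(10 * 60 * 1000); // crude: keep the test (and the db) alive to browse
            }

            private DbInspector() { }
        }

    Called right before a failing assertion (or evaluated from the debugger), this shows the tables and rows exactly as the paused test sees them.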

    Read the article

  • VisualAssert Testing in C++, Loading a test fixture.

    - by C_Bevan
    Good day, I am learning testing in Visual Studio C++ and I have followed several tutorials. I am trying to load a test fixture. I have tried to put the test .cpp file in many different places, but it still will not pick it up when I click on "Run Tests" or "Run Tests without debugging". In the tutorials I found, the fixtures seemed to load into the Test Explorer automatically, but in mine there is an icon with an X and (PROJECTNAME).EXE, and when I hover over it I get "the process exited without registering with the agent... this is due to the model not containing any test fixtures...". How can I load my tests into the Test Explorer, or register them with my project? I've tried right-clicking and "Add Fixture...", but that just starts a new test file and I have the same problem. Anybody know how I can solve this issue?

    Read the article

  • How does the workflow between testers doing testing and coders doing the coding for pending testing work?

    - by dotnetdev
    In a large company that does software development, they often have dedicated teams for build management, testing, development, and so forth. Agile or not, how does this workflow amongst teams work? I mean would the test team write unit tests and then the dev team write code to adhere to these tests (basically TDD)? And then the test team may write tests for a completely different project or have a slight quiet period until the dev team have done their coding. What possible workflows are there? This is something that interests me greatly. I know that in my current company we are doing it incorrectly (we have 1 tester about 5 devs, which is small scale) but I am not sure how exactly to draw out the ideal workflow. Many (ok, an ex-Project Manager) have tried, but all failed.

    Read the article

  • Discussion: Working TDD in a Scrum context

    - by Anders Juul
    Hi all, When I work with TDD I tend to start out with the big picture and create the tests that should succeed in order for the overall assignment to be completed - this then kicks off a number of supporting classes/methods/tests as I 'dig in'. If my assignment has been planned out in detail, I would then open one task, and in order to solve it, open another and then another. Only when the overall tests are succeeding can I close the original task, which means that at any given time I would have a number of open tasks. I find that this approach conflicts somewhat with the Scrum approach where, ideally, I should be able to take and close a task within a day's work - and never have more than one task open at a time. I'm looking for input about how you manage this in your project - references to articles are also very welcome; I'm sure this has been debated thoroughly somewhere... The 'answer tick' will be awarded to the best comment/reference. Thanks for any input, Anders, Denmark

    Read the article

  • Can I use multiple step definition files with SpecFlow?

    - by Roger Lipscombe
    I'm using SpecFlow to do some BDD-style testing. Some of my features are UI tests, so they use WatiN. Some aren't UI tests, so they don't. At the moment, I have a single StepDefinitions.cs file, covering all of my features. I have a BeforeScenario step that initializes WatiN. This means that all of my tests start up Internet Explorer, whether they need it or not. Is there any way in SpecFlow to have a particular feature file associated with a particular set of step definitions? Or am I approaching this from the wrong angle?

    Read the article

  • How can a Windows program temporarily change its time zone?

    - by Rob Kennedy
    I've written a function to return the time_t value corresponding to midnight on a given day. When there is no midnight for a given day, it returns the earliest time available; that situation can occur, for example, when Egypt enters daylight-saving time. This year, the time change takes effect at midnight on the night of April 29, so the clock goes directly from 23:59 to 01:00. Now I'm writing unit tests for this function, and one of the tests should replicate the Egypt scenario. In Unix, I can accomplish it like this: putenv("TZ", "Egypt", true); tzset(); After doing that, further calls to localtime behave as if they're in Egypt instead of Minnesota, and my tests pass. Merely setting the environment variable doesn't have any effect on Windows, though. What can I do to make the unit test think it's somewhere else without affecting the rest of the programs running on the system?

    Read the article

  • How can I get Hudson to be able to access JUnit?

    - by Bedwyr Humphreys
    I've got Hudson running on Tomcat; it can build my NetBeans project using the Ant build.xml, but it won't run any of my unit tests because of what I assume is a problem with the classpath: package org.junit does not exist [javac] import org.junit.After; [javac] ^ But I've got junit-4.8.1.jar on the classpath in /etc/environment and I can successfully run the JUnit tests from a console using java org.junit.runner.JUnitCore org.junit.tests.AllTests My CLASSPATH is set to /home/bedwyr/junit4.8.1/junit-4.8.1.jar:. Am I going wrong somewhere, or is there anything else I need to set? [edit] What I did was to export/include (using the IDE) all libraries (including JUnit); Hudson then reads all it needs from the Subversion repo. I then ran into an issue with exposing Hudson to the internet, and pretty soon gave up on Tomcat on Ubuntu Server (again, to do with the Tomcat security manager) - GlassFish is a lot smoother and that's where I am now - an Apache front end with ajp_proxy to Hudson on GlassFish.

    Read the article

  • How do we name test methods where we are checking for more than one condition?

    - by Sandbox
    I follow the technique specified in Roy Osherove's The Art of Unit Testing book when naming test methods - MethodName_Scenario_Expectation. It works perfectly well for my 'unit' tests. But for tests that I write in a 'controller' or 'coordinator' class, there isn't necessarily a method which I want to test. For these tests, I generate multiple conditions which make up one scenario and then I verify the expectation. For example, I may set some properties on different instances, generate an event, and then verify that my expectations of the controller/coordinator are being met. Now, my controller handles events using a private event handler. Here my scenario is that I set some properties, say three: condition1, condition2 and condition3; my scenario also includes an event being raised. I don't have a method name, as my event handler is private. How do I name such a test method?
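
    One option, sketched below in JUnit-flavoured Java purely for illustration (TrafficLightController and its fields are invented), is to keep the Osherove triplet but let the first slot name the triggering event or unit of work instead of a method, and fold the multiple conditions into the scenario slot.

        import static org.junit.Assert.assertEquals;
        import org.junit.Test;

        public class TrafficLightControllerTests {

            // Tiny stand-in coordinator, invented for this example: three conditions
            // plus an event handler that would be private in the real class.
            static class TrafficLightController {
                boolean powerOn, sensorsOk, nightMode;
                String state = "OFF";
                void onTick() {
                    if (powerOn && sensorsOk && nightMode) state = "FLASHING_AMBER";
                }
            }

            // UnitOfWork_Scenario_Expectation: the first slot names the triggering
            // event, the scenario slot folds in all three conditions, and the
            // private handler's name never appears.
            @Test
            public void tickEvent_PowerOnSensorsOkAndNightMode_SwitchesToFlashingAmber() {
                TrafficLightController controller = new TrafficLightController();
                controller.powerOn = true;
                controller.sensorsOk = true;
                controller.nightMode = true;

                controller.onTick(); // the "event is raised" part of the scenario

                assertEquals("FLASHING_AMBER", controller.state);
            }
        }

    The private handler never shows up in the name; only the externally observable behaviour does.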

    Read the article

  • JSF 2 -- Composite component with optional listener attribute on f:ajax

    - by Dave Maple
    I have a composite component that looks something like this: <!DOCTYPE html> <html xmlns:h="http://java.sun.com/jsf/html" xmlns:f="http://java.sun.com/jsf/core" xmlns:dm="http://davemaple.com/dm-taglib" xmlns:rich="http://richfaces.org/rich" xmlns:cc="http://java.sun.com/jsf/composite" xmlns:fn="http://java.sun.com/jsp/jstl/functions" xmlns:ui="http://java.sun.com/jsf/facelets" xmlns:a4j="http://richfaces.org/a4j"> <cc:interface> <cc:attribute name="styleClass" /> <cc:attribute name="textBoxStyleClass" /> <cc:attribute name="inputTextId" /> <cc:attribute name="labelText" /> <cc:attribute name="tabindex" /> <cc:attribute name="required" default="false" /> <cc:attribute name="requiredMessage" /> <cc:attribute name="validatorId" /> <cc:attribute name="converterId" /> <cc:attribute name="title"/> <cc:attribute name="style"/> <cc:attribute name="unicodeSupport" default="false"/> <cc:attribute name="tooltip" default="false"/> <cc:attribute name="tooltipText" default=""/> <cc:attribute name="tooltipText" default=""/> <cc:attribute name="onfail" default=""/> <cc:attribute name="onpass" default=""/> </cc:interface> <cc:implementation> <ui:param name="converterId" value="#{! empty cc.attrs.converterId ? cc.attrs.converterId : 'universalConverter'}" /> <ui:param name="validatorId" value="#{! empty cc.attrs.validatorId ? cc.attrs.validatorId : 'universalValidator'}" /> <ui:param name="component" value="#{formFieldBean.getComponent(cc.attrs.inputTextId)}" /> <ui:param name="componentValid" value="#{((facesContext.maximumSeverity == null and empty component.valid) or component.valid) ? true : false}" /> <ui:param name="requiredMessage" value="#{! empty cc.attrs.requiredMessage ? cc.attrs.requiredMessage : msg['validation.generic.requiredMessage']}" /> <ui:param name="clientIdEscaped" value="#{fn:replace(cc.clientId, ':', '\\\\\\\\:')}" /> <h:panelGroup layout="block" id="#{cc.attrs.inputTextId}ValidPanel" style="display:none;"> <input type="hidden" id="#{cc.attrs.inputTextId}Valid" value="#{componentValid}" /> </h:panelGroup> <dm:outputLabel for="#{cc.clientId}:#{cc.attrs.inputTextId}" id="#{cc.attrs.inputTextId}Label">#{cc.attrs.labelText}</dm:outputLabel> <dm:inputText styleClass="#{cc.attrs.textBoxStyleClass}" tabindex="#{cc.attrs.tabindex}" id="#{cc.attrs.inputTextId}" required="#{cc.attrs.required}" requiredMessage="#{requiredMessage}" title="#{cc.attrs.title}" unicodeSupport="#{cc.attrs.unicodeSupport}"> <f:validator validatorId="#{validatorId}" /> <f:converter converterId="#{converterId}" /> <cc:insertChildren /> <f:ajax event="blur" execute="@this" render="#{cc.attrs.inputTextId}ValidPanel #{cc.attrs.inputTextId}Msg" onevent="on#{cc.attrs.inputTextId}Event" /> </dm:inputText> <rich:message for="#{cc.clientId}:#{cc.attrs.inputTextId}" id="#{cc.attrs.inputTextId}Msg" style="display: none;" /> <script> function on#{cc.attrs.inputTextId}Event(e) { if(e.status == 'success') { $('##{clientIdEscaped}\\:#{cc.attrs.inputTextId}').trigger($('##{cc.attrs.inputTextId}Valid').val()=='true'?'pass':'fail'); } } $('##{clientIdEscaped}\\:#{cc.attrs.inputTextId}').bind('fail', function() { $('##{clientIdEscaped}\\:#{cc.attrs.inputTextId}, ##{clientIdEscaped}\\:#{cc.attrs.inputTextId}Label, ##{cc.attrs.inputTextId}Msg, ##{cc.id}Msg').addClass('error'); $('##{cc.id}Msg').html($('##{clientIdEscaped}\\:#{cc.attrs.inputTextId}Msg').html()); #{cc.attrs.onfail} }).bind('pass', function() { $('##{clientIdEscaped}\\:#{cc.attrs.inputTextId}, ##{clientIdEscaped}\\:#{cc.attrs.inputTextId}Label, 
##{cc.attrs.inputTextId}Msg, ##{cc.id}Msg').removeClass('error'); $('##{cc.id}Msg').html($('##{clientIdEscaped}\\:#{cc.attrs.inputTextId}Msg').html()); #{cc.attrs.onpass} }); </script> <a4j:region rendered="#{facesContext.maximumSeverity != null and !componentValid}"> <script> $(document).ready(function() { $('##{clientIdEscaped}\\:#{cc.attrs.inputTextId}').trigger('fail'); }); </script> </a4j:region> </cc:implementation> </html> I'd like to be able to add an optional "listener" attribute which if defined would add an event listener to my f:ajax but I'm having trouble figuring out how to accomplish this. Any help would be appreciated.

    Read the article

  • First time unit testing (in silverlight)

    - by Jakob
    Hi, I've searched some other posts, but most of them assumed that people knew what they were doing in their unit testing, and frankly I don't. I see the idea behind unit testing, and I'm coding a Silverlight application much in the blind right now, and I'd like to write some unit tests to be reasonably sure I'm on the right path. I'd like to be able to use the SL4 VS 2010 Silverlight unit test project template, to keep it simple and not use external tools. So what I need answers for are questions like: What are the methods of unit testing? What are the differences between unit tests and automated unit tests? How do I meaningfully unit test in Silverlight? What should I be aware of while unit testing (in Silverlight)? Also, should I implement some kind of IRepository pattern in my Silverlight app to make unit testing easier?

    Read the article

  • Integration Testing an Entire *Existing* Application (w/ automatic execution of test suite)

    - by Ev
    Hi there, I have just joined a team working on an existing Java web app. I have been tasked with creating an automated integration test suite that should run when developers commit to our continuous integration server (TeamCity), which automatically deploys to our staging server - so really the tests will be run against our staging web app server. I have read a lot about automated integration testing with frameworks like Watir, Selenium and RWebSpec. I have created tests in all of these, and while I prefer Watir, I am open to anything. The thing that hasn't become clear to me is how to create an entire test suite for an application, and how to have that suite execute in its entirety upon execution of some script. I can happily create individual tests of varying complexity, but there is a gap in my knowledge about how to tie everything together into something useful. Does anyone have any advice on how to create a full test suite and have it execute automatically? Thanks!
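
    Assuming the tests end up as JUnit 4 classes (Watir/RWebSpec would use a Rake task instead), the usual way to tie them together is a suite class that lists every integration test, plus a single entry point the CI server can call. The sketch below is generic; the three listed test classes are placeholders for whatever tests actually exist.

        import org.junit.runner.JUnitCore;
        import org.junit.runner.Result;
        import org.junit.runner.RunWith;
        import org.junit.runners.Suite;

        // One class that ties every integration test together; the CI server
        // only ever needs to run this suite.
        @RunWith(Suite.class)
        @Suite.SuiteClasses({
                LoginIntegrationTest.class,      // placeholder test classes --
                CheckoutIntegrationTest.class,   // replace with the real ones
                SearchIntegrationTest.class
        })
        public class StagingIntegrationSuite {

            // Optional command-line entry point so the suite can also be run from a script:
            //   java -cp ... StagingIntegrationSuite
            public static void main(String[] args) {
                Result result = JUnitCore.runClasses(StagingIntegrationSuite.class);
                System.exit(result.wasSuccessful() ? 0 : 1);  // non-zero exit fails the build
            }
        }

    Wired up as a command-line or Ant step that runs after the deploy-to-staging step, the non-zero exit code is what marks the TeamCity build as failed.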

    Read the article

  • Throughput measurements

    - by dotsid
    I wrote a simple load testing tool for testing the performance of Java modules. One problem I faced is the algorithm for throughput measurement. Tests are executed in several threads (the client configures how many times the test should be repeated), and the execution time is logged. So, when the tests are finished we have the following history (4 test executions, 2 threads, 36ms overall time; - is idle, * is test execution):

              5ms    9ms       4ms    13ms
        T1 |-*****-*********-****-*************-|
              3ms  6ms     7ms      11ms
        T2 |-***-******-*******-***********-----|
           <-----------------36ms--------------->

    For the moment I calculate throughput (per second) in the following way: 1000 / overallTime * threadCount. But there is a problem. What if one thread completes its own tests more quickly (for whatever reason)?

              3ms 3ms 3ms 3ms
        T1 |-***-***-***-***----------------|
              3ms  6ms     7ms      11ms
        T2 |-***-******-*******-***********-|
           <--------------32ms-------------->

    In this case the actual throughput is much better, because the measured throughput is bounded by the slowest thread. So, my question is: how should I measure the throughput of code execution in a multithreaded environment?
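
    A common alternative, sketched below with names of my own choosing, is to stop deriving throughput from that formula and instead count completed operations across all threads and divide by the wall-clock time of the whole run; a thread that finishes early then simply contributes fewer operations instead of distorting a per-thread average.

        import java.util.concurrent.CountDownLatch;
        import java.util.concurrent.atomic.AtomicLong;

        public class ThroughputMeter {

            // Runs `task` `iterationsPerThread` times on each of `threads` workers and
            // returns operations per second based on total completed work / wall time,
            // instead of 1000 / overallTime * threadCount.
            public static double measure(final Runnable task, int threads,
                                         final int iterationsPerThread) throws InterruptedException {
                final AtomicLong completed = new AtomicLong();
                final CountDownLatch start = new CountDownLatch(1);
                final CountDownLatch done = new CountDownLatch(threads);

                for (int i = 0; i < threads; i++) {
                    new Thread(new Runnable() {
                        public void run() {
                            try {
                                start.await();
                                for (int j = 0; j < iterationsPerThread; j++) {
                                    task.run();
                                    completed.incrementAndGet();
                                }
                            } catch (InterruptedException e) {
                                Thread.currentThread().interrupt();
                            } finally {
                                done.countDown();
                            }
                        }
                    }).start();
                }

                long begin = System.nanoTime();
                start.countDown();          // release all workers at once
                done.await();               // wait for the slowest thread
                double seconds = (System.nanoTime() - begin) / 1e9;

                return completed.get() / seconds;   // ops per second over the whole run
            }
        }

    If the slow-thread bias is still a concern, another variant is to sum each thread's own operations divided by that thread's own elapsed time, which treats the workers as independent clients.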

    Read the article

  • Scala simple dummy project.

    - by Lukasz Lew
    Currently my whole work cycle is: edit foo.scala, then fsc foo.scala && scala -cp . FooMain But my project is getting bigger and I would like to split files, write unit tests, etc. However, I'm too lazy to read the sbt documentation and do whatever needs to be done to get an sbt "Makefile". Similarly for unit tests (there are so many frameworks - which one to choose?). What would make my day is a simple zipped dummy project with dummy unit tests using sbt. Do you know whether such a thing exists?

    Read the article

  • Why can't I see any data in the Google App Engine *Development* Console?

    - by willem
    I run my Google App Engine application in one of two ways: directly, by using the application from http://localhost:8080, or by executing unit tests from http://localhost:8080/test When I create entities by using the application directly, the data is visible in the Development Console (datastore view). However, when I execute the unit tests - even if they succeed and I can put() and get() data - the data does not show in the datastore view. Any idea why I can't see my data, even though it is there? Notes: I use GAEUnit for unit tests. The data stored mostly consists of StringProperties(). I use Python and run Django on top of GAE; I don't know if that matters.

    Read the article

  • dynamic silverlight content

    - by Jeremy
    I am starting a Silverlight project where I have tests for students to complete. I want to have some sort of framework so I can build the tests and store them in a database, delivering the content dynamically so I can continually develop new types of tests without having to re-deploy the application. The content will have to be more than just XAML, as there may need to be some logic to determine if answers are correct, or to do some random generation of questions. I'm looking for suggestions on how to go about building a framework that supports this. Are there some best practices, or examples? Should each test type just be a separate Silverlight control, or should I use one Silverlight "container" application that can display dynamic content?

    Read the article

  • Do I have to create a static library to test my application?

    - by Christopher Gateley
    I'm just getting started with TDD and am curious as to what approaches others take to run their tests. For reference, I am using the google testing framework, but I believe the question is applicable to most other testing frameworks and to languages other than C/C++. My general approach so far has been to do either one of three things: Write the majority of the application in a static library, then create two executables. One executable is the application itself, while the other is the test runner with all of the tests. Both link to the static library. Embed the testing code directly into the application itself, and enable or disable the testing code using compiler flags. This is probably the best approach I've used so far, but clutters up the code a bit. Embed the testing code directly into the application itself, and, given certain command-line switches either run the application itself or run the tests embedded in the application. None of these solutions are particularly elegant... How do you do it?
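
    The question is about C++ and the Google Test framework, but the third approach is easy to see in any language; here is the same switch-on-a-flag idea sketched with JUnit in Java, purely as an illustration (SmokeTests is a placeholder suite class and the flag name is arbitrary).

        import org.junit.runner.JUnitCore;
        import org.junit.runner.Result;

        public class Main {

            public static void main(String[] args) {
                // One binary: a command-line switch decides whether to run the
                // embedded test suite or start the application normally.
                if (args.length > 0 && "--run-tests".equals(args[0])) {
                    Result result = JUnitCore.runClasses(SmokeTests.class); // placeholder suite
                    System.exit(result.wasSuccessful() ? 0 : 1);
                }
                runApplication(args);
            }

            private static void runApplication(String[] args) {
                System.out.println("normal application start-up goes here");
            }
        }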

    Read the article

  • Is Pex (test generation) a really useful tool?

    - by Yauheni Sivukha
    Yes, it is possible to generate tests on boundary values for functions like "Sum" or "Divide", and Pex is a good tool here. But more often we create tests for business behaviour. Let's consider an example from Beck's classic TDD book: [Test] public void ShouldRoundOnCreation() { Money money = new Money(20.678); Assert.AreEqual(20.68,money.Amount); Assert.AreEqual(2068,money.Cents); } Can this test be generated? No :) 95% of the tests in my projects check business logic and cannot be generated. Pex (especially paired with Moles) can give 100% code coverage, but a high code coverage rate for a test suite never indicates that the code is well tested - it only gives false confidence that everything is tested. And this is very dangerous. So, the question is: is Pex really a useful tool?

    Read the article

  • CDI @Conversation not propagated with handleNavigation()

    - by Thomas Kernstock
    I have a problem with the propagation of a long runnig conversation when I redirect the view by the handleNavigation() method. Here is my test code: I have a conversationscoped bean and two views: conversationStart.xhtml is called in Browser with URL http://localhost/tests/conversationStart.jsf?paramTestId=ParameterInUrl <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml" xmlns:h="http://java.sun.com/jsf/html" xmlns:f="http://java.sun.com/jsf/core"> <f:metadata> <f:viewParam name="paramTestId" value="#{conversationTest.fieldTestId}" /> <f:event type="preRenderView" listener="#{conversationTest.preRenderView}" /> </f:metadata> <h:head> <title>Conversation Test</title> </h:head> <h:body> <h:form> <h2>Startpage Test Conversation with Redirect</h2> <h:messages /> <h:outputText value="Testparameter: #{conversationTest.fieldTestId}"/><br /> <h:outputText value="Logged In: #{conversationTest.loggedIn}"/><br /> <h:outputText value="Conversation ID: #{conversationTest.convID}"/><br /> <h:outputText value="Conversation Transient: #{conversationTest.convTransient}"/><br /> <h:commandButton action="#{conversationTest.startLogin}" value="Login ->" rendered="#{conversationTest.loggedIn==false}" /><br /> <h:commandLink action="/tests/conversationLogin.xhtml?faces-redirect=true" value="Login ->" rendered="#{conversationTest.loggedIn==false}" /><br /> </h:form> <h:link outcome="/tests/conversationLogin.xhtml" value="Login Link" rendered="#{conversationTest.loggedIn==false}"> <f:param name="cid" value="#{conversationTest.convID}"></f:param> </h:link> </h:body> </html> The Parameter is written to the beanfield and displayed in the view correctly. There are 3 different possibilites to navigate to the next View. All 3 work fine. The beanfield shows up the next view (conversationLogin.xhtml) too: <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml" xmlns:h="http://java.sun.com/jsf/html" xmlns:f="http://java.sun.com/jsf/core"> <h:head> <title>Conversation Test</title> </h:head> <h:body> <h:form> <h2>Loginpage Test Conversation with Redirect</h2> <h:messages /> <h:outputText value="Testparameter: #{conversationTest.fieldTestId}"/><br /> <h:outputText value="Logged In: #{conversationTest.loggedIn}"/><br /> <h:outputText value="Conversation ID: #{conversationTest.convID}"/><br /> <h:outputText value="Conversation Transient: #{conversationTest.convTransient}"/><br /> <h:commandButton action="#{conversationTest.login}" value="Login And Return" /><br /> </h:form> </h:body> </html> When I return to the Startpage by clicking the button the conversation bean still contains all values. So everything is fine. 
Here is the bean: package test; import java.io.Serializable; import javax.annotation.PostConstruct; import javax.enterprise.context.Conversation; import javax.enterprise.context.ConversationScoped; import javax.faces.event.ComponentSystemEvent; import javax.inject.Inject; import javax.inject.Named; @Named @ConversationScoped public class ConversationTest implements Serializable{ private static final long serialVersionUID = 1L; final String CONVERSATION_NAME="longRun"; @Inject Conversation conversation; private boolean loggedIn; private String fieldTestId; @PostConstruct public void init(){ if(conversation.isTransient()){ conversation.begin(CONVERSATION_NAME); System.out.println("New Conversation started"); } loggedIn=false; } public String getConvID(){ return conversation.getId(); } public boolean isConvTransient(){ return conversation.isTransient(); } public boolean getLoggedIn(){ return loggedIn; } public String startLogin(){ return "/tests/conversationLogin.xhtml?faces-redirect=true"; } public String login(){ loggedIn=true; return "/tests/conversationStart.xhtml?faces-redirect=true"; } public void preRenderView(ComponentSystemEvent ev) { // if(!loggedIn){ // System.out.println("Will redirect to Login"); // FacesContext ctx = FacesContext.getCurrentInstance(); // ctx.getApplication().getNavigationHandler().handleNavigation(ctx, null, "/tests/conversationLogin.xhtml?faces-redirect=true"); // ctx.renderResponse(); // } } public void setFieldTestId(String fieldTestId) { System.out.println("fieldTestID was set to: "+fieldTestId); this.fieldTestId = fieldTestId; } public String getFieldTestId() { return fieldTestId; } } Now comes the problem !! As soon as I try to redirect the page in the preRenderView method of the bean (just uncomment the code in the method), using handleNavigation() the bean is created again in the next view instead of using the allready created instance. Although the cid parameter is propagated to the next view ! Has anybody an idea what's wrong ? best regards Thomas
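
    One workaround that is often suggested for this symptom - offered here only as a sketch, since it may not be the root cause in this setup - is to skip handleNavigation() in preRenderView and issue the redirect yourself with the cid request parameter appended, because a long-running CDI conversation is only carried across a redirect by that parameter. The method below reuses the names from the question and would additionally need java.io.IOException, javax.faces.FacesException and javax.faces.context.FacesContext imports in the bean.

        public void preRenderView(ComponentSystemEvent ev) {
            if (!loggedIn) {
                FacesContext ctx = FacesContext.getCurrentInstance();
                String url = ctx.getExternalContext().getRequestContextPath()
                        + "/tests/conversationLogin.jsf";
                if (!conversation.isTransient()) {
                    // carry the long-running conversation across the redirect explicitly
                    url = url + "?cid=" + conversation.getId();
                }
                try {
                    // redirect() also marks the response as complete
                    ctx.getExternalContext().redirect(url);
                } catch (IOException e) {
                    throw new FacesException(e);
                }
            }
        }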

    Read the article
