Search Results

Search found 4783 results on 192 pages for 'tests'.


  • In Bjarne's book.

    - by atch
    In one of the exercises (ch. 5, e. 8) from TC++PL, Bjarne asks you to do the following: "Run some tests to see if your compiler really generates equivalent code for iteration using pointers and iteration using indexing. If different degrees of optimization can be requested, see if and how that affects the quality of the generated code." Any idea how to approach this, and with what tools? Thanks in advance.
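
    A minimal sketch of the two loop styles the exercise has in mind is below; compile each to assembly (for example with g++ -S -O0 and again with g++ -S -O2) and diff the output to see whether the compiler treats them identically.

      // Two equivalent summation loops: one iterates via a pointer,
      // the other via an index. Compare the assembly each produces.
      #include <cstddef>

      long sum_by_pointer(const int* v, std::size_t n) {
          long total = 0;
          for (const int* p = v; p != v + n; ++p)   // iteration using pointers
              total += *p;
          return total;
      }

      long sum_by_index(const int* v, std::size_t n) {
          long total = 0;
          for (std::size_t i = 0; i != n; ++i)      // iteration using indexing
              total += v[i];
          return total;
      }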

    Read the article

  • Testing JavaMail Related Modules

    - by Hamza Yerlikaya
    Part of my application depends on JavaMail -- moving and arranging messages, etc. Is it possible to test this module without firing up an IMAP server to run the tests against? I am always stuck when it comes to testing stuff that depends on external servers or modules.
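
    One common approach (a sketch, not the only way) is to start an in-memory mail server inside the test itself. The example below assumes the GreenMail library (com.icegreen.greenmail), which provides embedded SMTP/IMAP servers on test ports; check the method names against the version you use.

      import javax.mail.internet.MimeMessage;

      import com.icegreen.greenmail.util.GreenMail;
      import com.icegreen.greenmail.util.GreenMailUtil;
      import com.icegreen.greenmail.util.ServerSetupTest;

      public class MailModuleTest {
          public static void main(String[] args) throws Exception {
              // Embedded SMTP + IMAP servers on GreenMail's test ports
              // (3025/3143 by default) -- no external server required.
              GreenMail greenMail = new GreenMail(ServerSetupTest.SMTP_IMAP);
              greenMail.start();
              try {
                  // Exercise your mail module against localhost here; this
                  // helper just drops a message into the embedded server.
                  GreenMailUtil.sendTextEmailTest("to@localhost", "from@localhost",
                                                  "subject", "body");
                  MimeMessage[] received = greenMail.getReceivedMessages();
                  if (received.length != 1)
                      throw new AssertionError("expected exactly one message");
              } finally {
                  greenMail.stop();
              }
          }
      }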

    Read the article

  • Can simple javascript inheritance be simplified even further?

    - by Will
    John Resig (of jQuery fame) provides a concise and elegant way to allow simple JavaScript inheritance. It was so short and sweet, in fact, that it inspired me to try and simplify it even further (see code below). I've modified his original function such that it still passes all his tests and has the potential advantages of:

      - readability (50% less code)
      - simplicity (you don't have to be a ninja to understand it)
      - performance (no extra wrappers around super/base method calls)
      - consistency with C#'s base keyword

    Because this seems almost too good to be true, I want to make sure my logic doesn't have any fundamental flaws/holes/bugs, or if anyone has additional suggestions to improve or refute the code (perhaps even John Resig could chime in here!). Does anyone see anything wrong with my approach (below) vs. John Resig's original approach?

      if (!window.Class) {
          window.Class = function() {};
          window.Class.extend = function(members) {
              var prototype = new this();
              for (var i in members) prototype[i] = members[i];
              prototype.base = this.prototype;
              function object() {
                  if (object.caller == null && this.initialize)
                      this.initialize.apply(this, arguments);
              }
              object.constructor = object;
              object.prototype = prototype;
              object.extend = arguments.callee;
              return object;
          };
      }

    And the tests (below) are nearly identical to the original ones, except for the syntax around base/super method calls (for the reason enumerated above):

      var Person = Class.extend({
          initialize: function(isDancing) { this.dancing = isDancing; },
          dance: function() { return this.dancing; }
      });

      var Ninja = Person.extend({
          initialize: function() { this.base.initialize(false); },
          dance: function() { return this.base.dance(); },
          swingSword: function() { return true; }
      });

      var p = new Person(true);
      alert("true? " + p.dance());      // => true

      var n = new Ninja();
      alert("false? " + n.dance());     // => false
      alert("true? " + n.swingSword()); // => true
      alert("true? " + (p instanceof Person && p instanceof Class &&
                        n instanceof Ninja && n instanceof Person &&
                        n instanceof Class));

    Read the article

  • programming from a usb stick, .net

    - by dean nolan
    I was wondering if there is a way I could program and compile .NET applications (C#, ASP.NET MVC) from a USB stick on any laptop I plug it into. I am looking for a solution that does not require installing programs on the laptop, so I need to be able to run an IDE or editor from an exe and compile, presumably from the command line. I was also wondering if I can run MSTest projects from the command line to check that tests passed, etc. Thanks
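
    A hedged sketch of what the stick could carry: the .NET Framework's own csc.exe compiles without any IDE, and a console test runner can live on the stick too. All paths below are typical defaults, not guaranteed ones -- adjust to the host machine.

      rem run.bat -- compile the sources on the stick, then run the tests.
      set FW=%WINDIR%\Microsoft.NET\Framework\v3.5
      %FW%\csc.exe /target:library /out:MyApp.Tests.dll *.cs

      rem NUnit's console runner runs straight from a folder on the stick:
      nunit-console.exe MyApp.Tests.dll

      rem If Visual Studio happens to be present, MSTest works from the command line too:
      rem "%ProgramFiles%\Microsoft Visual Studio 9.0\Common7\IDE\mstest.exe" /testcontainer:MyApp.Tests.dll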

    Read the article

  • Testing a Generic Class

    - by Jonas Gorauskas
    More than a question, per se, this is an attempt to compare notes with other people. I wrote a generic History class that emulates the functionality of a browser's history. I am trying to wrap my head around how far to go when writing unit tests for it. I am using NUnit. Please share your testing approaches below. The full code for the History class is here (http://pastebin.com/ZGKK2V84).
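
    For reference, a hedged sketch of the kind of behavioural tests such a class invites is below; the History<T> members used (Push, Back, CanGoForward) are assumptions based on typical browser-history behaviour, so rename them to match the actual pastebin code.

      using NUnit.Framework;

      [TestFixture]
      public class HistoryTests
      {
          [Test]
          public void BackReturnsToThePreviousEntry()
          {
              var history = new History<string>();   // hypothetical API
              history.Push("a");
              history.Push("b");
              Assert.AreEqual("a", history.Back());
          }

          [Test]
          public void PushAfterBackDiscardsForwardEntries()
          {
              var history = new History<string>();
              history.Push("a");
              history.Push("b");
              history.Back();
              history.Push("c");                     // should clear the forward stack
              Assert.IsFalse(history.CanGoForward);
          }
      }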

    Read the article

  • Sources of latency in sending-receiving tcp/udp packets in linux

    - by osgx
    Hello. What are the sources of latency in the process of sending/receiving TCP/UDP packets in Linux 2.6? I want to know the latency sources in "ping-pong" latency tests. There are some rather good papers on Ethernet latency, but they cover only the latency sources in the wire and the switch (and those rather cursorily, only for a specific switch). What processing steps does a packet go through? Papers with a deep latency analysis of ordinary ping (ICMP) would be useful too. I rely on the community :)
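
    For reference, the ping-pong test itself can be as small as the sketch below; the measured RTT then bundles together every userspace-visible source: NIC, driver, softirq processing, socket wakeup and scheduling on both ends. It assumes a UDP echo service is already listening; HOST and PORT are placeholder values. Compile with gcc -std=gnu99 -O2 probe.c -o probe -lrt.

      #include <arpa/inet.h>
      #include <netinet/in.h>
      #include <stdio.h>
      #include <string.h>
      #include <sys/socket.h>
      #include <time.h>
      #include <unistd.h>

      int main(void) {
          const char *HOST = "127.0.0.1";   /* assumption: echo peer address */
          const int PORT = 7;               /* assumption: classic echo port */
          const int ROUNDS = 1000;

          int s = socket(AF_INET, SOCK_DGRAM, 0);
          struct sockaddr_in addr;
          memset(&addr, 0, sizeof addr);
          addr.sin_family = AF_INET;
          addr.sin_port = htons(PORT);
          inet_pton(AF_INET, HOST, &addr.sin_addr);
          connect(s, (struct sockaddr *)&addr, sizeof addr);

          char byte = 'x';
          struct timespec t0, t1;
          double total_us = 0;
          for (int i = 0; i < ROUNDS; i++) {
              clock_gettime(CLOCK_MONOTONIC, &t0);
              send(s, &byte, 1, 0);
              recv(s, &byte, 1, 0);           /* blocks until the echo returns */
              clock_gettime(CLOCK_MONOTONIC, &t1);
              total_us += (t1.tv_sec - t0.tv_sec) * 1e6
                        + (t1.tv_nsec - t0.tv_nsec) / 1e3;
          }
          printf("mean RTT: %.1f us over %d rounds\n", total_us / ROUNDS, ROUNDS);
          close(s);
          return 0;
      }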

    Read the article

  • ignore extra spaces when using fgets

    - by Gary
    Hi, I'm using fgets with stdin to read in some data, with the max length I read in being 25. With one of the tests I'm running on this code, there are a few hundred spaces after the data that I want, which causes the program to fail. Can someone advise me how to ignore all of these extra spaces when using fgets and move on to the next line? Thanks
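
    A minimal sketch of the usual fix: after fgets, if no newline landed in the buffer the line was longer than requested, so drain the remainder with getchar before the next read, then trim trailing whitespace.

      #include <ctype.h>
      #include <stdio.h>
      #include <string.h>

      int main(void) {
          char buf[26];                               /* 25 chars + '\0' */
          while (fgets(buf, sizeof buf, stdin) != NULL) {
              if (strchr(buf, '\n') == NULL) {        /* line didn't fit */
                  int c;
                  while ((c = getchar()) != EOF && c != '\n')
                      ;                               /* discard the excess */
              }
              size_t len = strlen(buf);
              while (len > 0 && isspace((unsigned char)buf[len - 1]))
                  buf[--len] = '\0';                  /* strip trailing spaces/newline */
              printf("got: '%s'\n", buf);
          }
          return 0;
      }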

    Read the article

  • Traditional IO vs memory-mapped

    - by Senne
    I'm trying to illustrate the difference in performance between traditional IO and memory-mapped files in Java to students. I found an example somewhere on the internet, but not everything is clear to me; I don't even think all the steps are necessary. I have read a lot about it here and there, but I'm not convinced that either of them is implemented correctly. The code I'm trying to understand is:

      public class FileCopy {
          public static void main(String args[]) {
              if (args.length < 1) {
                  System.out.println(" Wrong usage!");
                  System.out.println(" Correct usage is : java FileCopy <large file with full path>");
                  System.exit(0);
              }

              String inFileName = args[0];
              File inFile = new File(inFileName);
              if (inFile.exists() != true) {
                  System.out.println(inFileName + " does not exist!");
                  System.exit(0);
              }

              try {
                  new FileCopy().memoryMappedCopy(inFileName, inFileName + ".new");
                  new FileCopy().customBufferedCopy(inFileName, inFileName + ".new1");
              } catch (FileNotFoundException fne) {
                  fne.printStackTrace();
              } catch (IOException ioe) {
                  ioe.printStackTrace();
              } catch (Exception e) {
                  e.printStackTrace();
              }
          }

          public void memoryMappedCopy(String fromFile, String toFile) throws Exception {
              long timeIn = new Date().getTime();
              // read input file
              RandomAccessFile rafIn = new RandomAccessFile(fromFile, "rw");
              FileChannel fcIn = rafIn.getChannel();
              ByteBuffer byteBuffIn = fcIn.map(FileChannel.MapMode.READ_WRITE, 0, (int) fcIn.size());
              fcIn.read(byteBuffIn);
              byteBuffIn.flip();

              RandomAccessFile rafOut = new RandomAccessFile(toFile, "rw");
              FileChannel fcOut = rafOut.getChannel();
              ByteBuffer writeMap = fcOut.map(FileChannel.MapMode.READ_WRITE, 0, (int) fcIn.size());
              writeMap.put(byteBuffIn);

              long timeOut = new Date().getTime();
              System.out.println("Memory mapped copy Time for a file of size :" + (int) fcIn.size() + " is " + (timeOut - timeIn));
              fcOut.close();
              fcIn.close();
          }

          static final int CHUNK_SIZE = 100000;
          static final char[] inChars = new char[CHUNK_SIZE];

          public static void customBufferedCopy(String fromFile, String toFile) throws IOException {
              long timeIn = new Date().getTime();
              Reader in = new FileReader(fromFile);
              Writer out = new FileWriter(toFile);
              while (true) {
                  synchronized (inChars) {
                      int amountRead = in.read(inChars);
                      if (amountRead == -1) {
                          break;
                      }
                      out.write(inChars, 0, amountRead);
                  }
              }
              long timeOut = new Date().getTime();
              System.out.println("Custom buffered copy Time for a file of size :" + (int) new File(fromFile).length() + " is " + (timeOut - timeIn));
              in.close();
              out.close();
          }
      }

    When exactly is it necessary to use RandomAccessFile? Here it is used to read and write in memoryMappedCopy -- is it actually necessary just to copy a file at all? Or is it a part of memory mapping? In customBufferedCopy, why is synchronized used here?
    I also found a different example that should test the performance between the two:

      public class MappedIO {
          private static int numOfInts = 4000000;
          private static int numOfUbuffInts = 200000;

          private abstract static class Tester {
              private String name;
              public Tester(String name) { this.name = name; }
              public long runTest() {
                  System.out.print(name + ": ");
                  try {
                      long startTime = System.currentTimeMillis();
                      test();
                      long endTime = System.currentTimeMillis();
                      return (endTime - startTime);
                  } catch (IOException e) {
                      throw new RuntimeException(e);
                  }
              }
              public abstract void test() throws IOException;
          }

          private static Tester[] tests = {
              new Tester("Stream Write") {
                  public void test() throws IOException {
                      DataOutputStream dos = new DataOutputStream(
                          new BufferedOutputStream(
                              new FileOutputStream(new File("temp.tmp"))));
                      for (int i = 0; i < numOfInts; i++)
                          dos.writeInt(i);
                      dos.close();
                  }
              },
              new Tester("Mapped Write") {
                  public void test() throws IOException {
                      FileChannel fc = new RandomAccessFile("temp.tmp", "rw").getChannel();
                      IntBuffer ib = fc.map(FileChannel.MapMode.READ_WRITE, 0, fc.size()).asIntBuffer();
                      for (int i = 0; i < numOfInts; i++)
                          ib.put(i);
                      fc.close();
                  }
              },
              new Tester("Stream Read") {
                  public void test() throws IOException {
                      DataInputStream dis = new DataInputStream(
                          new BufferedInputStream(new FileInputStream("temp.tmp")));
                      for (int i = 0; i < numOfInts; i++)
                          dis.readInt();
                      dis.close();
                  }
              },
              new Tester("Mapped Read") {
                  public void test() throws IOException {
                      FileChannel fc = new FileInputStream(new File("temp.tmp")).getChannel();
                      IntBuffer ib = fc.map(FileChannel.MapMode.READ_ONLY, 0, fc.size()).asIntBuffer();
                      while (ib.hasRemaining())
                          ib.get();
                      fc.close();
                  }
              },
              new Tester("Stream Read/Write") {
                  public void test() throws IOException {
                      RandomAccessFile raf = new RandomAccessFile(new File("temp.tmp"), "rw");
                      raf.writeInt(1);
                      for (int i = 0; i < numOfUbuffInts; i++) {
                          raf.seek(raf.length() - 4);
                          raf.writeInt(raf.readInt());
                      }
                      raf.close();
                  }
              },
              new Tester("Mapped Read/Write") {
                  public void test() throws IOException {
                      FileChannel fc = new RandomAccessFile(new File("temp.tmp"), "rw").getChannel();
                      IntBuffer ib = fc.map(FileChannel.MapMode.READ_WRITE, 0, fc.size()).asIntBuffer();
                      ib.put(0);
                      for (int i = 1; i < numOfUbuffInts; i++)
                          ib.put(ib.get(i - 1));
                      fc.close();
                  }
              }
          };

          public static void main(String[] args) {
              for (int i = 0; i < tests.length; i++)
                  System.out.println(tests[i].runTest());
          }
      }

    I more or less see what's going on; my output looks like this:

      Stream Write: 653
      Mapped Write: 51
      Stream Read: 651
      Mapped Read: 40
      Stream Read/Write: 14481
      Mapped Read/Write: 6

    What is making the Stream Read/Write so unbelievably long? As a read/write test, it looks a bit pointless to me to read the same integer over and over (if I understand correctly what's going on in the Stream Read/Write). Wouldn't it be better to read ints from the previously written file and just read and write ints in the same place? Is there a better way to illustrate it? I've been breaking my head over a lot of these things for a while and I just can't get the whole picture.

    Read the article

  • LINQ Equivalent for Standard Deviation

    - by Steven
    Does LINQ model the aggregate SQL function STDDEV() (standard deviation)? If not, what is the simplest / best-practice way to calculate it? Example:

      SELECT test_id, AVG(result) avg, STDDEV(result) std
      FROM tests
      GROUP BY test_id
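
    LINQ has no built-in StdDev operator; a common approach is a small extension method such as the sketch below. It computes the population standard deviation -- divide by (count - 1) instead if your STDDEV() means the sample statistic. The TestId/Result property names in the usage comment are assumptions.

      using System;
      using System.Collections.Generic;
      using System.Linq;

      public static class Stats
      {
          // Population standard deviation of a sequence of doubles.
          public static double StdDev(this IEnumerable<double> values)
          {
              var list = values.ToList();
              double avg = list.Average();
              return Math.Sqrt(list.Sum(v => (v - avg) * (v - avg)) / list.Count);
          }
      }

      // Usage mirroring the SQL above:
      // var stats = from r in tests
      //             group r by r.TestId into g
      //             select new { TestId = g.Key,
      //                          Avg = g.Average(r => r.Result),
      //                          Std = g.Select(r => r.Result).StdDev() };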

    Read the article

  • informix odbc connection slow to open in asp.net

    - by zombiegx
    I have an application that takes a long time (around 20 seconds) to open ODBC connections. It also takes forever using ArcMap and ArcSDE, but when I test the connection in the ODBC Data Source Administrator, it tests really fast. Does anyone have any idea of what may be causing this? BTW, the application works fine on another computer with another database. Thanks.

    Read the article

  • Specification Pattern vs Spec in BDD

    - by cadmium
    I'm trying to explore Behavior Driven Design and Domain Driven Design. I get that written specifications drive the tests in BDD, but also that business logic can be encapsulated using the specification pattern for re-use in domain objects, repositories, etc. Are these basically the same concept just used in different ways, used in conjunction with each other, or am I completely confusing the concepts? Please shed some light, if possible.
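
    For reference, the specification pattern itself is tiny; a minimal sketch is below (Account and its members are hypothetical examples). The same rule object can then drive a BDD-style test and filter candidates in a repository, which is why the two ideas feel so closely related.

      using System;

      // A business rule as a reusable, named object that domain code,
      // repositories and tests can all evaluate.
      public interface ISpecification<T>
      {
          bool IsSatisfiedBy(T candidate);
      }

      // Hypothetical domain type for illustration only.
      public class Account
      {
          public DateTime DueDate;
          public decimal Balance;
      }

      public class OverdueAccountSpecification : ISpecification<Account>
      {
          public bool IsSatisfiedBy(Account account)
          {
              return account.DueDate < DateTime.Today && account.Balance > 0m;
          }
      }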

    Read the article

  • GUI Application becomes unresponsive while http request is being done

    - by JW
    I have just made my first proper little desktop GUI application that basically wraps a GUI (php-gtk) interface around a SimpleTest Web Test case to make it act as a remote testing client. Each time the local Web Test case runs, it sends an HTTP request to another SimpleTest case (one that has an XHTML interface) sitting on my server. The application allows me to run one local test that collates information from multiple remote tests. It just has a 'Start Test' button, a 'Stop Test' button and a setting to increase/decrease the number of remote tests conducted in each HTTP request. Each test run takes about an hour to complete.

    The trouble is, most of the time the application is making HTTP requests, and whenever an HTTP request is being made, the application's GUI is unresponsive. I have taken to making the application wait a few seconds (iterating through Gtk::main_iteration) between requests in order to give the user time to re-size the window, press the Stop button, etc. But this makes the whole test run take a lot longer than is necessary.

      <?php
      require_once('simpletest/web_tester.php');

      class TestRemoteTestingClient extends WebTestCase
      {
          function testRunIterations()
          {
              ...
              $this->assertTrue($this->get($nextUrl), 'getting from pointer:' . $this->_remoteMementoPointer);
              $this->assertResponse(200, "checking response for " . $nextUrl);
              $this->assertText('RemoteNodeGreen');
              $this->doGtkIterationsForMinNSeconds($secs);
              ...
          }

          public function doGtkIterationsForMinNSeconds($secs)
          {
              $this->appendStatusMessage("Waiting " . $secs);
              $start = time();
              $end = $start + $secs;
              while (time() < $end) {
                  $this->appendStatusMessage("Waiting " . ($end - time()));
                  while (Gtk::events_pending())
                      Gtk::main_iteration();
              }
          }
      }

    Is there a way to keep the application responsive while, at the same time, making an HTTP request? I am considering splitting the application into two, where:

      - Test Controller Application: acts as a settings-writer / report-reader; it writes to a settings file and reads a report file.
      - Test Runner Application: acts as a settings-reader / report-writer; for each iteration it reads the settings file, runs the test, then writes a report.

    So, to tell it to close down, I'd press the Stop button on the 'Test Controller Application', which writes to the settings file, which is read by the 'Test Runner Application', which stops and then writes to the report file to say it stopped; the 'Test Controller Application' then reads the report and updates the status, and so on...

    However, before I go ahead and split the application in two, I am wondering if there is any other obvious way to deal with this issue. I suspect it is probably quite common and a well-trodden path. Also, is there an easier way to send messages between two applications?

    Read the article

  • TDD and management

    - by Pierreten
    My manager is starting to get pretty pissed off that I'm devoting time to designing tests (he sees testing as something you do after the software is written). How do I convince him otherwise?

    Read the article

  • WatiN, VBScript

    - by little_devil
    I am using WatiN for my web tests. In one of my web pages, I have a VBScript function which opens a dialog box, and I am unable to access it using WatiN. I tried using the WatiN Test Recorder; it was unsuccessful. I also tried this: http://blogs.dovetailsoftware.com/blogs/kmiller/archive/2008/07/16/scenario-testing-with-watin.aspx -- unsuccessful again.

    Read the article

  • How do I display the validation which failed in my Rails unit test?

    - by JD
    Summary: Failed unit tests tell me which assert (file:line) failed, but not which validation resulted in the failure. More info: I have 11 validations in one of my models. Unit testing is great, whether I run 'rake test:units --trace' or 'ruby -Itest test/unit/mymodel_test.rb'. However, despite the fact that it tells me exactly which assert() failed me, I am not told which validation failed. I must be missing something obvious, because I can't phrase this question well enough for Google to answer it. Thanks :)
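
    One hedged workaround: pass the model's error messages as the assertion's failure message, so the test output names the validation that failed. errors.full_messages is standard ActiveRecord; MyModel and valid_attributes below are placeholders.

      # Inside the model's unit test case:
      def test_should_be_valid
        m = MyModel.new(valid_attributes)
        # If valid? is false, the joined validation messages become the
        # failure message printed by the test runner.
        assert m.valid?, m.errors.full_messages.join(", ")
      end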

    Read the article

  • Is there a way to avoid debugger?

    - by Gabriel Šcerbák
    I don't like debugging in a debugger, because I think it often operates below the abstraction layer of the programming language, and it is often not reproducible. I favor using unit tests when possible, and I think they are a good approach, but they are not always easy to implement. Do you know of any other alternative approaches that avoid the use of a debugger?

    Read the article

  • Python : get all exe files in current directory and run them?

    - by Maciek
    Hey all, first of all, this is not homework; I'm in desperate need of a script that will do the following. My problem is that I've never had to deal with Python before, so I barely know how to use it, and I need it to launch unit tests in TeamCity via a command-line build runner. What I need exactly is:

      - a *.bat file that will run the script
      - a python script that will:
          - get all *_test.exe files in the current working directory
          - run all the files which were the result of the search (see the sketch below)

    Best regards
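
    A minimal sketch under those requirements: the .bat file can be a single line such as python run_tests.py (the script name is just an example), and the script only needs glob and subprocess. A non-zero exit code makes a command-line build runner like TeamCity's mark the build as failed.

      # run_tests.py -- find every *_test.exe in the current directory,
      # run each one, and exit non-zero if any of them fails.
      import glob
      import subprocess
      import sys

      failed = False
      for exe in glob.glob("*_test.exe"):
          print("running", exe)
          code = subprocess.call([exe])   # blocks until the test binary exits
          if code != 0:
              print(exe, "FAILED with exit code", code)
              failed = True

      sys.exit(1 if failed else 0)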

    Read the article

  • Handling national language prefix for checkconstraints

    - by Chris Chilvers
    I'm trying to create a check constraint such as CHECK (Type IN (N'Create', N'Remove')) for an enumeration's value. SQLite complains about this syntax and only accepts CHECK (Type IN ('Create', 'Remove')). The main database will be SQL Server 2005, but I use SQLite's in-memory database for unit tests. Is there any way to get SQLite to recognise the national language (N) prefix? Alternatively, is there an easy way, when using FluentNHibernate, to adapt an nvarchar constant to match the database's dialect?

    Read the article

  • Powermock Slows Down Test Startup on Eclipse/Fedora 10 when on NTFS partition

    - by MrWiggles
    I've just started having a proper play with PowerMock and noticed that it slows down test startup immensely. A quick look at top while it was running showed that mount.ntfs-3g was taking up most of the CPU. I moved Eclipse and my source directory to ext3 partitions to see if that was the problem, and the tests now start up quicker, but there's still a noticeable delay. Is this normal with PowerMock, or am I missing something obvious?

    Read the article
