Search Results

Search found 28765 results on 1151 pages for 'software testing'.

Page 17/1151 | < Previous Page | 13 14 15 16 17 18 19 20 21 22 23 24  | Next Page >

  • Survey: how do you unit test your T-SQL?

    - by Alexander Kuznetsov
    How do you unit test your T-SQL? Which libraries/tools do you use? What percentage of your code is covered by unit tests, and how do you measure it? Do you think the time and effort you invested in your unit testing harness has paid off or not?

    Read the article

  • Software management for 2 programmers

    - by kajo
    Hi all, my very good friend and I run a small business. We have a company and we develop web apps using Scala. We started 3 months ago and we have a lot of work now. We cannot afford to employ another programmer because we can't pay him yet. Until now we have tried to manage the entire development process very simply: we use Excel sheets for simple bug tracking and we work on client requests on the fly. We have no plan for the next week or anything similar, and I now find this very inefficient and unproductive. I am trying to find some rules or a methodology suited to a small team, or to just two guys. Scrum, for example, seems unsuited to us: there are a lot of roles (ScrumMaster, Product Owner, Team...) and it feels like overkill. Can you advise me on something? Do you have any experience with software management in small teams? Is any current agile methodology a good fit for a pair of programmers? Is there any tooling for simple bug tracking, maybe a wiki, or time management for two coders? Thanks a lot for sharing.

    Read the article

  • JUnit: splitting integration tests and unit tests

    - by jeff porter
    Hello all, I've inherited a load of JUnit tests, but these tests (apart from most not working) are a mixture of actual unit tests and integration tests (requiring external systems, a db, etc.). So I'm trying to think of a way to actually separate them out, so that I can run the unit tests nice and quickly and the integration tests after that. The options are:

    1: Split them into separate directories.
    2: Move to JUnit4 and annotate the classes to separate them.
    3: Use a file naming convention to tell what a class is, i.e. AdapterATest and AdapterAIntegrationTest.

    Option 3 has the issue that Eclipse has the option to "Run all tests in the selected project/package or folder", so it would make it very hard to run just the integration tests. Option 2 runs the risk that developers might start writing integration tests in unit test classes and it just gets messy. Option 1 seems like the neatest solution, but my gut says there must be a better solution out there. So that is my question: how do you lot break apart integration tests and proper unit tests?
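
    For illustration, the annotation idea in option 2 carries over to other test frameworks too; here is a minimal sketch of the same split using Python's pytest markers (the marker name "integration" and the example classes are assumptions, not from the question; JUnit4's @Category achieves the equivalent in Java):

        import pytest

        class TestAdapterA:
            """Fast, isolated unit tests: run these on every build."""
            def test_parses_valid_payload(self):
                assert True  # a real assertion would go here

        @pytest.mark.integration
        class TestAdapterAIntegration:
            """Needs external systems (db, network): run these separately."""
            def test_round_trip_against_database(self):
                assert True  # a real assertion would go here

    Running pytest -m "not integration" then executes only the fast tests, and pytest -m integration runs the slow suite, so neither directory layout nor naming conventions has to carry the distinction.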

    Read the article

  • ubuntu 12.04 not responding after software update

    - by Loukas
    Yesterday I did a software update and since then the system has become unusable. To be more precise, I log in and then I see only the background image, without any bars etc. This happened right after updating the system. Please find below the relevant part of /var/log/apt/history.log. I'm running the 64-bit 12.04 distribution on nVidia. For the first 2-3 restarts after the upgrade it complained about compiz/unity crashes, and since then the situation is as described above. I would be most grateful if you could help me with this. Best regards, Loukas

        Start-Date: 2012-06-05 10:31:42
        Commandline: aptdaemon role='role-commit-packages' sender=':1.77'
        Upgrade: libpolkit-backend-1-0:amd64 (0.104-1, 0.104-1ubuntu1), libgnome-control-center1:amd64 (3.4.1-0ubuntu2, 3.4.2-0ubuntu0.2), libgl1-mesa-dri:amd64 (8.0.2-0ubuntu3, 8.0.2-0ubuntu3.1), libgl1-mesa-dri:i386 (8.0.2-0ubuntu3, 8.0.2-0ubuntu3.1), libgl1-mesa-glx:amd64 (8.0.2-0ubuntu3, 8.0.2-0ubuntu3.1), libgl1-mesa-glx:i386 (8.0.2-0ubuntu3, 8.0.2-0ubuntu3.1), libpolkit-gobject-1-0:amd64 (0.104-1, 0.104-1ubuntu1), gnome-control-center-data:amd64 (3.4.1-0ubuntu2, 3.4.2-0ubuntu0.2), libglapi-mesa:amd64 (8.0.2-0ubuntu3, 8.0.2-0ubuntu3.1), libglapi-mesa:i386 (8.0.2-0ubuntu3, 8.0.2-0ubuntu3.1), policykit-1:amd64 (0.104-1, 0.104-1ubuntu1), software-center:amd64 (5.2.2.1, 5.2.2.2), update-manager:amd64 (0.156.14.4, 0.156.14.5), update-manager-core:amd64 (0.156.14.4, 0.156.14.5), libxatracker1:amd64 (8.0.2-0ubuntu3, 8.0.2-0ubuntu3.1), libpolkit-agent-1-0:amd64 (0.104-1, 0.104-1ubuntu1), gnome-control-center:amd64 (3.4.1-0ubuntu2, 3.4.2-0ubuntu0.2), libglu1-mesa:amd64 (8.0.2-0ubuntu3, 8.0.2-0ubuntu3.1), libglu1-mesa:i386 (8.0.2-0ubuntu3, 8.0.2-0ubuntu3.1)
        End-Date: 2012-06-05 10:32:51

    Read the article

  • ubuntu software center not working after update

    - by HOS
    Today I updated my Ubuntu (I had the KDE desktop installed on it before), and after the update my Ubuntu Software Center says: "Items cannot be installed or removed until the package catalog is repaired. Do you want to repair it?" When I click Repair, after 2 seconds it says "The installation or removal of a software package failed", with these details:

        installArchives() failed:
        dpkg: dependency problems prevent configuration of kdm:
         kdm depends on libkworkspace4abi1 (= 4:4.8.5-0ubuntu0.1); however:
          Version of libkworkspace4abi1 on system is 4:4.8.5-0ubuntu0.2.
         kdm depends on kde-workspace-kgreet-plugins (= 4:4.8.5-0ubuntu0.1); however:
          Version of kde-workspace-kgreet-plugins on system is 4:4.8.5-0ubuntu0.2.
        dpkg: error processing kdm (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of kubuntu-desktop:
         kubuntu-desktop depends on kdm; however:
          Package kdm is not configured yet.
        dpkg: error processing kubuntu-desktop (--configure):
         dependency problems - leaving unconfigured
        No apport report written because the error message indicates its a followup error from a previous failure.
        No apport report written because the error message indicates its a followup error from a previous failure.
        Errors were encountered while processing:
         kdm
         kubuntu-desktop
        Error in function:
        dpkg: dependency problems prevent configuration of kdm:
         kdm depends on libkworkspace4abi1 (= 4:4.8.5-0ubuntu0.1); however:
          Version of libkworkspace4abi1 on system is 4:4.8.5-0ubuntu0.2.
         kdm depends on kde-workspace-kgreet-plugins (= 4:4.8.5-0ubuntu0.1); however:
          Version of kde-workspace-kgreet-plugins on system is 4:4.8.5-0ubuntu0.2.
        dpkg: error processing kdm (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of kubuntu-desktop:
         kubuntu-desktop depends on kdm; however:
          Package kdm is not configured yet.
        dpkg: error processing kubuntu-desktop (--configure):
         dependency problems - leaving unconfigured

    Please help me, I love Ubuntu and I want to repair it. Thanks

    Read the article

  • Questions to ask the interviewer in an interview

    - by chota
    Hello all, I have an SDET interview coming up next week at a good company, and I have been preparing for a long time. I have been working as an SDET for two years. I wonder what questions I should ask my interviewer about testing and other things. I would appreciate your help in coming up with some sample questions to ask the interviewer during the interview. Some that I have thought of are: a) What testing methodologies do you use? b) Do you have a triage meeting every day? c) What percentage of code coverage comes from unit tests? I do not find these questions very effective, so I would appreciate it if somebody could help me come up with better ones.

    Read the article

  • Recurring lsb-release and Software Center glitch after installing MATE

    - by infomorph
    I just recently upgraded to Ubuntu 11.10. Not a fan of Unity, so I decided to try out the MATE desktop from Linux Mint. I added the Mint repository, grabbed and installed the MATE packages, and got rid of the repo so I wouldn't be downloading any other Mint packages. I did have some glitches with the packages (missing dependency stuff), but I fixed them. As other users have reported, installing MATE temporarily breaks the Ubuntu Software Center because lsb_release shows the machine as Linux Mint rather than Ubuntu. I can fix it as noted in this answer by editing /etc/*release and /etc/*issue. Problem is, this only works until I reboot the machine. Every time I reboot, /etc/lsb-release and /etc/issue revert to Linux Mint, breaking Software Center again until I edit them again. Can anyone help me pin down what keeps changing these files? Much appreciated, thanks. Rephrasing the crux of the problem: where do /etc/lsb-release and /etc/issue get their info from? What would cause them to be revised on reboot?

    Read the article

  • Which are the different ways I can update the software catalog?

    - by Manish Kumar Chauhan
    While facing problems with Software Center 5.2.6 on Ubuntu 12.04, I reinstalled the Software Center and executed the following command in a GNOME terminal:

        $ sudo dpkg --configure -a
        Setting up software-center (5.2.6) ...
        Updating software catalog... this may take a moment.

    However, there is little or nothing happening beyond this point. Is there any other way to update the software catalog? Every other time I open the Software Center it keeps crashing.

    Read the article

  • Is there a real difference between dynamic analysis and testing?

    - by user970696
    Testing is often regarded as dynamic analysis of software. Yet while I was writing my thesis, the reviewer noted to me that dynamic analysis is about analyzing the program behind the scenes - e.g. profiling - and that it is not the same as testing, because it is "analysis" which looks inside and observes. I know that "static analysis" is not testing; should we then also separate this "dynamic analysis" from testing? Some books do refer to dynamic analysis in this sense. I would maybe say that testing is one means of dynamic analysis?

    Read the article

  • Advantage of Software Development [on hold]

    - by user93319
    The worth of the brand a corporation carries is far greater than its physical presence. If you are venturing into an online business, you need to take special care of the corporate image of your business. Nowadays it is very important for every organization to have its own website, and to enhance the online presence of a company it is important to have a good website design as well as a blueprint of that design. A quality site can support organizational growth and lend a hand in achieving the company's goals promptly. When websites have gained so much importance, it is advisable for an organization to seek professional support for the construction of its own exclusive portals. Expert help may again be essential when one needs a complete renovation on the Web. Any group needs to do a bit of introspection before it decides to look for the services of a professional web software development company. It is good to be completely clear-headed regarding one's requirements. To start with, a business should be familiar with who its potential clientele are. Once this main factor has been given consideration, an organization can go ahead and get its website designed accordingly. On approaching a corporation that offers software development services in India to its customers, a client can make sure that their site is ready with all the most up-to-date features. Professional assistance matters: a professional service supplier is known to furnish a site with easy-to-use features that prompt visitors to come back again and again. Yet one more benefit of receiving aid from professionals is that they can advise you on the type of content that you should place on display on your site. For example, a business that wants to draw the interest of professionals belonging to the corporate world must make sure that the language used on its website is crisp and official.

    Read the article

  • Rebuilding CoasterBuzz, Part IV: Dependency injection, it's what's for breakfast

    - by Jeff
    (Repost from my personal blog.) This is another post in a series about rebuilding one of my Web sites, which has been around for 12 years. I hope to relaunch soon. More:

    Part I: Evolution, and death to WCF
    Part II: Hot data objects
    Part III: The architecture using the "Web stack of love"

    If anything generally good for the craft has come out of the rise of ASP.NET MVC, it's that people are more likely to use dependency injection, and loosely couple the parts of their applications. A lot of the emphasis on coding this way has been to facilitate unit testing, and that's awesome. Unit testing makes me feel a lot less like a hack, and a lot more confident in what I'm doing.

    Dependency injection is pretty straightforward. It says, "Given an instance of this class, I need instances of other classes, defined not by their concrete implementations, but by their interfaces." Probably the first place a developer exercises this is when having a class talk to some kind of data repository. For a very simple example, pretend the FooService has to get some Foo. It looks like this:

        public class FooService
        {
            // The service depends on the abstraction, not a concrete repository.
            public FooService(IFooRepository fooRepo)
            {
                _fooRepo = fooRepo;
            }

            private readonly IFooRepository _fooRepo;

            public Foo GetMeFoo()
            {
                return _fooRepo.FooFromDatabase();
            }
        }

    When we need the FooService, we ask the dependency container to get it for us. It says, "You'll need an IFooRepository in that, so let me see what that's mapped to, and put it in there for you."

    Why is this good for you? It's good because your FooService doesn't know or care about how you get some Foo. You can stub out what the methods and properties on a fake IFooRepository might return, and test just the FooService. I don't want to get too far into unit testing, but it's the most commonly cited reason to use DI containers in MVC.

    What I wanted to mention is how there's another benefit in a project like mine, where I have to glue together a bunch of stuff. For example, when I have someone sign up for a new account on CoasterBuzz, I'm actually using POP Forums' new account mailer, which composes a bunch of text that includes a link to verify your account. The thing is, I want to use custom text and some other logic that's specific to CoasterBuzz. To accomplish this, I make a new class that inherits from the forum's NewAccountMailer, and override some stuff. Easy enough. Then I use Ninject, the DI container I'm using, to unbind the forum's implementation, and substitute my own. Ninject uses something called a NinjectModule to bind interfaces to concrete implementations. The forum has its own module, and then the CoasterBuzz module is loaded second. The CB module has two lines of code to swap out the mailer implementation:

        Unbind<PopForums.Email.INewAccountMailer>();
        Bind<PopForums.Email.INewAccountMailer>().To<CbNewAccountMailer>();

    Piece of cake! Now, when code asks the DI container for an INewAccountMailer, it gets my custom implementation instead. This is a lot easier to deal with than some of the alternatives. I could do some copy-paste, but then I'm not using well-tested code from the forum. I could write stuff from scratch, but then I'm throwing away a bunch of logic I've already written (in this case, stuff around e-mail, e-mail settings, and mail delivery failures).

    There are other places where the DI container comes in handy. For example, CoasterBuzz does a number of custom things with user profiles, and special content for paid members. It uses the forum as the core piece for managing users, so I can ask the container to get me instances of classes that do user lookups, for example, and have zero care about how the forum handles database calls, configuration, etc. What a great world to live in, compared to ten years ago. Sure, the primary interest in DI is around the "separation of concerns" and facilitating unit testing, but as your library grows and you use more open source, it starts to be the glue that pulls everything together.

    Read the article

  • Why can't I install from software center?

    - by user64720
    There was a problem upgrading to Firefox 13. This error kept returning:

        /var/cache/apt/archives/firefox_13.0+build1-0ubuntu0.12.04.1_i386.deb
        W: Waited for dpkg --assert-multi-arch but was not there - dpkgGo (10: There are no "child" processes).

    Now it seems that there is some problem with dpkg and I can't install anything from the software center. I already tried to clean the previous packages with sudo rm /var/lib/apt/lists/* -vf and then sudo apt-get update; it didn't work. When running sudo dpkg --configure -a, I get this:

        dpkg: problems with dependencies prevent the configuration of firefox-globalmenu:
         firefox-globalmenu depends on firefox (= 13.0+build1-0ubuntu0.12.04.1); however:
          The package is not installed.
        dpkg: error while processing firefox-globalmenu (--configure):
         problems with dependencies - leaving unconfigured
        Errors were found while processing:
         firefox-globalmenu

    What should I do to fix this?

    EDIT: I don't have the necessary expertise to understand why what I did worked or what was causing the conflict, but anyway: since there was a problem with firefox-globalmenu, I went to the Synaptic package manager, removed this particular package and reinstalled it. After that, I was able to install Firefox from Synaptic and also any other applications from the software center. However, there was still a problem: when running sudo apt-get update, the following kept returning:

        Failed to get gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise_main_binary-i386_Packages
        Verification code hash doesn't match.
        E: Some archive indexes failed to download. They have been ignored, or older copies are used instead.

    So I typed sudo rm /var/lib/apt/lists/* -vf in a terminal and then again sudo apt-get update, and everything is fine now. I did this before an answer was posted; anyway, I agree the problem was that particular package, and its removal fixed it, so I'll mark the answer below as accepted.

    Read the article

  • How long to wait before Humble Bundle games appear in Software Centre?

    - by Synesso
    At software-center.ubuntu.com it says:

        Thank you for downloading your Humble Bundle games from the Ubuntu Software Center.
        Notes: As these games have been recently added to Ubuntu Software Center it might take a minute for them to appear. If you see a "Not found" message, Ubuntu Software Center is working in the background to update the list of available apps.

    When I click on a link, the Software Centre opens and it says:

        There isn't a software package called "swordandsworcery" in your current software sources.

    I have waited for about 30 minutes now. I have also executed sudo apt-get update and restarted the Software Centre, to no avail. Do I keep waiting?

    Read the article

  • Test Doubles: Do they go in "source packages" or "test packages"?

    - by sbrattla
    I've got a couple of data access objects (DefaultPersonServices.class, DefaultAddressServices.class) which are responsible for various CRUD operations in a database. A few different classes use these services, but as the services require that a connection be established with a database, I can't really use them in unit tests as they take too long. Thus, I'd like to create test doubles for them and simply write FakePersonServices.class and FakeAddressService.class implementations which I can use throughout testing. Now, this is all good (I assume)... but my question relates to where I put the test doubles. Should I keep them along with the default implementations (aka the "real" implementations), or should I keep them in a corresponding test package? The default implementations are found in Source Packages: com.company.data.services. Should I keep the test doubles here too, or should the test doubles rather be in Test Packages: com.company.data.services?
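
    For illustration, a minimal sketch of the same pattern in Python, with the fake living in the test tree rather than next to the production code; the class and method names here are assumptions, not taken from the question:

        # company/data/services.py -- production code
        class PersonServices:
            """The interface the rest of the code depends on."""
            def find_person(self, person_id):
                raise NotImplementedError

        class DefaultPersonServices(PersonServices):
            """Real implementation; needs a live database connection."""
            def __init__(self, connection):
                self._connection = connection
            def find_person(self, person_id):
                raise NotImplementedError  # real SQL would go here

        # tests/fakes.py -- test double, kept with the tests
        class FakePersonServices(PersonServices):
            """In-memory stand-in: no database, so tests stay fast."""
            def __init__(self, people=None):
                self._people = dict(people or {})
            def find_person(self, person_id):
                return self._people.get(person_id)

    Keeping FakePersonServices under the test tree means production builds never ship test-only code, which is the usual argument for the "test packages" answer.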

    Read the article

  • Can I use other ways than Ubuntu Software Center to open `apt` links?

    - by cipricus
    Without Ubuntu Software Center installed in Lubuntu, I was unable to set any program to open apt links in Firefox (see this question). After installing Ubuntu Software Center that problem is solved, but could I use another program instead of Ubuntu Software Center for the same purpose? I find it too heavy, and to install software I prefer the terminal, gdebi, Lubuntu Software Center or Synaptic. (Now that I have the apt option in Firefox/Preferences/Applications, I have tried to change Ubuntu Software Center to Lubuntu Software Center, but this does not change the option.)

    Read the article

  • What Are Some Tips For Writing A Large Number of Unit Tests?

    - by joshin4colours
    I've recently been tasked with testing some COM objects of the desktop app I work on. What this means in practice is writing a large number (100) of unit tests to test different but related methods and objects. While the unit tests themselves are fairly straightforward (usually one or two Assert()-type checks per test), I'm struggling to figure out the best way to write these tests in a coherent, organized manner. What I have found is that copy-and-paste coding should be avoided: it creates more problems than it's worth, and it's even worse than copy-and-paste code in production code, because test code has to be updated and modified more frequently. I'm leaning toward trying an OO approach, but again, the sheer number of tests makes even this approach daunting from an organizational standpoint, due to concerns about maintenance. It also doesn't help that the tests are currently written in C++, which adds some complexity with memory management issues. Any thoughts or suggestions?
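
    One common way out of copy-and-paste test code is to make the tests table-driven, so each new case is a data row rather than a cloned method. A minimal sketch in Python's unittest, purely illustrative since the tests in question are C++ (where frameworks such as Google Test support the same idea with parameterized tests); count_items is a made-up stand-in for a method under test:

        import unittest

        def count_items(text):
            """Hypothetical method under test."""
            return len(text.split(";")) if text else 0

        class ComObjectTests(unittest.TestCase):
            # One row per case: (description, input, expected result).
            # Adding a case is one line, not another copy-pasted method.
            CASES = [
                ("empty input", "", 0),
                ("single item", "a", 1),
                ("two items", "a;b", 2),
            ]

            def test_count_items(self):
                for description, text, expected in self.CASES:
                    with self.subTest(description):
                        self.assertEqual(expected, count_items(text))

        if __name__ == "__main__":
            unittest.main()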

    Read the article

  • If you should only have one assertion per test, how do you test multiple inputs?

    - by speg
    I'm trying to build up some test cases, and have read that you should try to limit the number of assertions per test case. So my question is: what is the best way to go about testing a function with multiple inputs? For example, I have a function that parses a string from the user and returns the number of minutes. The string can be in the form "5w6h2d1m", where w, h, d, m correspond to the number of weeks, hours, days, and minutes. If I wanted to follow the "1 assertion per test" rule I'd have to make multiple tests for each variation of input? That seems silly, so instead I just have something like this in the one test case:

        self.assertEqual(parse_date('5m'), 5)
        self.assertEqual(parse_date('5h'), 300)
        self.assertEqual(parse_date('5d'), 7200)
        self.assertEqual(parse_date('1d4h20m'), 1700)

    Is there a better way?
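
    For illustration, the usual middle ground is a parameterized test: one logical assertion, many inputs, and each case still reported separately on failure. A minimal sketch using unittest's subTest, with a toy parse_date included only so the example runs:

        import re
        import unittest

        def parse_date(text):
            """Toy stand-in: converts '5w6h2d1m'-style strings to minutes."""
            factors = {"w": 7 * 24 * 60, "d": 24 * 60, "h": 60, "m": 1}
            return sum(int(n) * factors[u]
                       for n, u in re.findall(r"(\d+)([wdhm])", text))

        class ParseDateTest(unittest.TestCase):
            def test_parse_date(self):
                cases = [("5m", 5), ("5h", 300), ("5d", 7200), ("1d4h20m", 1700)]
                for text, expected in cases:
                    with self.subTest(text=text):
                        self.assertEqual(expected, parse_date(text))

        if __name__ == "__main__":
            unittest.main()

    pytest's @pytest.mark.parametrize expresses the same thing declaratively; either way a failing input shows up individually instead of aborting the whole method at the first bad case.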

    Read the article

  • How to write automated tests for SQL queries?

    - by James
    The current system we are adopting at work is to write some extremely complex queries which perform multiple calculations and have multiple joins/sub-queries. I don't think I am experienced enough to say whether this is correct or not, so I am going along and attempting to work within this system, as it has clear benefits. The problem we are having at the moment is that the person writing the queries makes a lot of mistakes and assumes everything is correct. We have now assigned a tester to analyse all of the queries, but this still proves extremely time-consuming and stressful. I would like to know how we could create an automated procedure (without writing it all by hand in code if possible, as I can work out how to do that the long way) to run a set of 10+ different inputs, verify the output data and say whether the calculations are correct. I know I could put specific data in the database and create a script in C# (the db is SQL Server) to verify all the values coming out, but I would like to know what the official "standard" is, as my experience is lacking in this area and I would like to improve. I am happy to add more information if required; add a comment if necessary. Thank you. Edit: I am using C#.
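
    The standard shape, whatever the tooling, is: load a small fixture dataset whose correct answers are computed by hand, run the query under test, and compare the result set against the expected rows. A minimal sketch in Python with SQLite for brevity (the same structure ports directly to C#/NUnit against SQL Server, and frameworks like tSQLt express it natively in T-SQL); the orders table and the query are made up for the example:

        import sqlite3
        import unittest

        QUERY = """SELECT customer_id, SUM(amount) AS total
                   FROM orders GROUP BY customer_id ORDER BY customer_id"""

        class QueryRegressionTest(unittest.TestCase):
            def setUp(self):
                # A fresh in-memory database with known fixture data per test.
                self.db = sqlite3.connect(":memory:")
                self.db.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
                self.db.executemany("INSERT INTO orders VALUES (?, ?)",
                                    [(1, 10.0), (1, 2.5), (2, 7.0)])

            def tearDown(self):
                self.db.close()

            def test_totals_per_customer(self):
                # Expected rows computed by hand from the fixture above.
                expected = [(1, 12.5), (2, 7.0)]
                self.assertEqual(expected, self.db.execute(QUERY).fetchall())

        if __name__ == "__main__":
            unittest.main()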

    Read the article

  • What do well written, readable tests look like?

    - by Industrial
    Doing unit testing for the first time at a large scale, I find myself writing a lot of repetitive unit tests for my business logic. Sure, to create complete test suites I need to test all possibilities, but readability feels compromised doing what I do, as shown in the pseudocode below. What would a well written, readable test suite look like?

        describe "UserEntity" ->
          it "valid name validates" ...
          it "invalid name doesnt validate" ...
          it "valid list of followers validate" ...
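
    For illustration, one convention that keeps such suites readable is grouping cases by the behaviour under test, with names that read as sentences and one scenario per test. A minimal pytest-style sketch; the UserEntity model is assumed, just enough for the example to run:

        class UserEntity:
            """Hypothetical model for the example."""
            def __init__(self, name, followers=()):
                self.name, self.followers = name, list(followers)
            def is_valid(self):
                return bool(self.name) and all(f.is_valid() for f in self.followers)

        class TestUserEntityValidation:
            def test_valid_name_validates(self):
                assert UserEntity("alice").is_valid()

            def test_empty_name_does_not_validate(self):
                assert not UserEntity("").is_valid()

            def test_valid_followers_validate(self):
                followers = [UserEntity("bob"), UserEntity("carol")]
                assert UserEntity("alice", followers).is_valid()

    The repetition that remains is deliberate: each test states its scenario and expectation in its own name, so a failure reads as a sentence about the business rule that broke.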

    Read the article

  • SQL University: What and why of database testing

    - by Mladen Prajdic
    This is a post for a great idea called SQL University, started by Jorge Segarra, also famously known as SqlChicken on Twitter. It's a collection of blog posts on different database related topics contributed by several smart people all over the world. So this week is mine, and we'll be talking about database testing and refactoring. In 3 posts we'll cover:

    SQLU part 1 - What and why of database testing
    SQLU part 2 - What and why of database refactoring
    SQLU part 3 - Tools of the trade

    With that out of the way, let us sharpen our pencils and get going.

    Why test a database

    The sad state of the industry today is that there is very little emphasis on testing in general. Test-driven development is still a small niche of the programming world, while refactoring is even smaller. The cause of this is the inability of developers to convince themselves and their managers that writing tests is beneficial. At the moment tests are mostly viewed as a waste of time. This is because the average person (let's not fool ourselves, we're all average) is unable to think about lower future costs in relation to a little more current work. It's orders of magnitude easier to reason about current costs in relation to the current amount of work. That's why programmers convince themselves testing is a waste of time.

    However, we have to ask ourselves what tests are really about. Maybe finding bugs? No, not really. If we introduce bugs, we're likely to write tests around those bugs too. But yes, we can find some bugs with tests. The main point of tests is to have reproducible repeatability in our systems. By having a code base largely covered by tests we can know with better certainty what a small code change can break in other parts of the system. By having repeatability we can make code changes with confidence, since we know we'll see what breaks in other tests. And here comes the inability to estimate future costs: by spending just a few more hours writing those tests, we'd know instantly what broke where.

    Imagine we fix a reported bug. We check in the code, deploy it and the users are happy. Until we get a call 2 weeks later that a certain monthly process has stopped working. What we don't know is that this process was developed by a long-gone coworker and for some reason it relied on that same bug we've happily fixed. There's no way we could've known that. We say OK and go in and fix the monthly process. But what we have no clue about is that there's an ETL job that relied on data from that monthly process. Now that we've fixed the process, it's giving unexpected (yet correct, since we fixed it) data to the ETL job. So we have to fix that too. But there's this part of the app we coded that relies on data from that exact ETL job. And just like that we enter the "loop of maintenance horror". With the loop eventually comes blame. Here's a nice tip for all developers and DBAs out there: if you make a mistake, man up and admit to it.

    All of the above is valid for any kind of software development. Keeping this in mind, the database is nothing other than just a part of the application. But a big part! One reason why testing a database is even more important than testing an application is that one database is usually accessed from multiple applications and processes. This makes it the central and vital part of the enterprise software infrastructure. Knowing all this, can we really afford not to have tests?

    What to test in a database

    Now that we've decided we'll dive into this testing thing, we have to ask ourselves what needs to be tested. The short answer is: everything. The long answer is: read on!

    There are 2 main ways of doing tests: black box and white box testing. Black box testing means we have no idea how the system internals are built and we only have access to its inputs and outputs. With it we test that internal changes to the system haven't caused the input/output behavior of the system to change. The most important things to test here are the edge conditions; that's where most programs break. Having good edge condition tests, we can be more confident that system changes won't break anything. White box testing has full knowledge of the system internals. With it we test the internal system changes, different states of the application, etc. White and black box tests should be complementary to each other, as they are very much interconnected.

    Testing database routines includes testing stored procedures, views, user defined functions and anything else you use to access the data. Database routines are your input/output interface to the database system, so they count as black box testing. We test them for 2 things: data and schema. When testing the schema we only care about the columns and the data types they're returning. After all, the schema is the contract to the outside systems; if it changes, we usually have to change the applications accessing it. One helpful T-SQL command when doing schema tests is SET FMTONLY ON. It tells SQL Server to return only empty result sets, which speeds up tests because no data is returned to the client. After we've validated the schema, we have to test the returned data. There is no other way to do this than to have the expected data known before the test executes and to compare that data to the database routine's output.

    Testing authentication and authorization helps us validate who has access to the SQL Server box (authentication) and who has access to certain database objects (authorization). For desktop applications and Windows authentication this works well. But the biggest problem here is web apps: they usually connect to the database as a single user. Please ensure that that user is not SA or an account with admin privileges. That is just bad.

    Load testing assures us that our database can handle peak loads. One often overlooked tool for load testing is Microsoft's OSTRESS tool. It's part of the RML utilities (x86, x64) for SQL Server and can help determine whether our database server can handle loads like 100 simultaneous users each doing 10 requests per second. SQL Profiler can also help us here by showing why certain queries are slow and what to do to fix them.

    One particular problem to think about is how to begin testing existing databases. The first thing we have to do is get to know those databases; we can't test something when we don't know how it works. To do this we have to talk to the users of the applications accessing the database, run SQL Profiler to see what queries are being run, use existing documentation to decipher all the object relationships, etc. The way to approach this is to choose one part of the database (say, a logical grouping of tables that go together) and filter our traces accordingly. Once we've done that, we move on to the next grouping, and so on until we've covered the whole database. Then we move on to the next database.

    Database testing is a topic that we could spend many hours discussing, but let this be a nice intro to the world of database testing. See you in the next post.
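
    For illustration, the black-box routine test described above (schema first, then data against rows known in advance) can be sketched very compactly; this example uses Python with SQLite for brevity, and the active_users view is made up for the demonstration (against SQL Server the same shape works through a driver such as pyodbc):

        import sqlite3
        import unittest

        class ActiveUsersViewTest(unittest.TestCase):
            def setUp(self):
                self.db = sqlite3.connect(":memory:")
                self.db.executescript("""
                    CREATE TABLE users (id INTEGER, name TEXT, active INTEGER);
                    CREATE VIEW active_users AS
                        SELECT id, name FROM users WHERE active = 1;
                    INSERT INTO users VALUES (1, 'alice', 1), (2, 'bob', 0);
                """)

            def test_schema_contract(self):
                # Schema test: the view must keep returning exactly these columns.
                cursor = self.db.execute("SELECT * FROM active_users LIMIT 0")
                self.assertEqual(["id", "name"],
                                 [col[0] for col in cursor.description])

            def test_expected_data(self):
                # Data test: output compared to rows known before the test ran.
                rows = self.db.execute("SELECT id, name FROM active_users").fetchall()
                self.assertEqual([(1, "alice")], rows)

        if __name__ == "__main__":
            unittest.main()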

    Read the article

  • Regression testing for firewall changes

    - by James C
    We have a number of firewalls in place around our organisation, and in some cases packets can pass through four levels of firewall limiting the flow of TCP traffic. A concept that I'm used to from software testing is regression testing, which allows you to run a test suite against a changed application to verify that the new changes haven't affected any old features. Does anyone have any experience with, or can anyone offer any solutions for, doing the same type of thing with firewall changes and network testing? The problem becomes a lot more complicated because you'd ideally want to be originating (and testing receipt of) packets across many machines.
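
    For illustration, the simplest form of such a suite is a table of expected reachability replayed after every firewall change; a minimal single-machine sketch in Python using only the standard library, with the hosts, ports and expectations invented for the example:

        import socket
        import unittest

        # Expected state after the change: (host, port, should_connect).
        RULES = [
            ("db.internal.example", 5432, True),   # app tier may reach the database
            ("db.internal.example", 22, False),    # but must not reach SSH on it
        ]

        def can_connect(host, port, timeout=3.0):
            """True if a TCP handshake to host:port succeeds within the timeout."""
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    return True
            except OSError:
                return False

        class FirewallRegressionTest(unittest.TestCase):
            def test_expected_reachability(self):
                for host, port, expected in RULES:
                    with self.subTest(f"{host}:{port}"):
                        self.assertEqual(expected, can_connect(host, port))

        if __name__ == "__main__":
            unittest.main()

    Running the same suite from each originating machine (with a listener on the far side for the receipt half) covers the multi-source aspect; the hard part the asker raises, coordinating senders and receivers across many hosts, still needs orchestration on top of checks like these.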

    Read the article

  • Verification vs validation again: does testing belong to verification? If so, which?

    - by user970696
    I have asked this before and created a lot of controversy, so I have tried to collect some data and am asking a similar question again. E.g. this V&V article, where all testing is only validation: http://www.buzzle.com/editorials/4-5-2005-68117.asp

    According to ISO 12207, testing is done in validation:
    • Prepare Test Requirements, Cases and Specifications
    • Conduct the Tests

    In verification, it mentions: "The code implements proper event sequence, consistent interfaces, correct data and control flow, completeness, appropriate allocation timing and sizing budgets, and error definition, isolation, and recovery." and "The software components and units of each software item have been completely and correctly integrated into the software item." I am not sure how to verify that without testing, but testing is not listed there as a technique.

    From IEEE: Verification: "The process of evaluating software to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase." [IEEE-STD-610]. Validation: "The process of evaluating software during or at the end of the development process to determine whether it satisfies specified requirements." [IEEE-STD-610] At the end of the development process? That would mean UAT.

    So the question is: which kinds of testing (unit, integration, system, UAT) are considered verification, and which validation? I do not understand why some say dynamic verification is testing, while others say testing is only validation. An example: I am testing an application. System requirements say there are two fields with a max length of 64 characters and a Save button. The use case says: the user will fill in first and last name and save. When checking the fields and the presence of the Save button, I would say it's verification. When I follow the use case, it's validation. So it's both together, done on the system as a whole.
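
    For illustration only, the asker's own example expressed as two tests: one checks the built system against its stated specification (verification, "did we build the product right?"), the other walks through the use case (validation, "did we build the right product?"). The tiny form model is hypothetical:

        import unittest

        class NameForm:
            """Hypothetical form: two 64-character fields and a save action."""
            MAX_LEN = 64
            def __init__(self):
                self.first, self.last, self.saved = "", "", False
            def save(self):
                if len(self.first) <= self.MAX_LEN and len(self.last) <= self.MAX_LEN:
                    self.saved = True
                return self.saved

        class VerificationTest(unittest.TestCase):
            def test_fields_reject_more_than_64_characters(self):
                # Checks conformance to the written system requirement.
                form = NameForm()
                form.first = "x" * 65
                self.assertFalse(form.save())

        class ValidationTest(unittest.TestCase):
            def test_user_fills_in_names_and_saves(self):
                # Exercises the use case end to end from the user's viewpoint.
                form = NameForm()
                form.first, form.last = "Jane", "Doe"
                self.assertTrue(form.save())

        if __name__ == "__main__":
            unittest.main()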

    Read the article

  • How to verify code that takes a substantial time to compile? [on hold]

    - by user18404
    As a follow-up to my previous question, "What is the best approach for coding in a slow compilation environment", to recap: I am stuck with a large software system for which the TDD ideology of "test often" does not work. And to make it even worse, features like pre-compiled headers, multi-threaded compilation, incremental linking, etc. are not available to me. Hence I think the best way out would be to add extensive logging to the system and to start "coding in large chunks", by which I mean coding for two to three hours first (as opposed to 15-20 minutes in TDD), thoroughly eyeballing the code for 15 minutes, and only after all that compiling and running the tests. As I have been doing TDD for quite a while, my code eyeballing / code verification skills have gotten rusty (you don't really need them that much if you can quickly verify what you've done in 5 seconds by running a test or two), so I am after recommendations on how to learn these source code verification and error-spotting skills again. I know I was able to do that easily some 5-10 years ago, when I didn't have much support from the compiler and unit testing tools I have had until recently, so there should be a way to get back to the basics.

    Read the article

  • Best way to explain to someone that software developers need to install tools (mainly build integration tools)

    - by leeand00
    I work at a software company where most of the people are afraid to install new tools to increase productivity. They give me excuses like "I don't need to install something else", "I can do this myself", and many other baseless arguments. In an ecommerce business the end-users should not have to install anything; everything should be managed by them from the web, and the developers should be the ones installing things to increase productivity and teamwork, i.e.:

    - Version control systems
    - Build tools (Ant, NAnt, Maven, continuous integration, CSS frameworks)
    - Integrated development environments
    - Frameworks (unit testing, etc.)
    - Etc.

    How else can I get my point across without sounding crass?

    Read the article
