Tuesday, February 3, 2009

You Can't Automate User Acceptance Testing

One of the things I hear from clients is "we want our business analysts to be able to automate our user acceptance tests." I cringe every time I hear that phrase. It doesn't work, for two reasons.

The first is that the analyst usually doesn't have the skill set needed to automate a test effectively, and management is reluctant to make that kind of investment in them. The second is that you can't automate a user acceptance test. You can automate the functionality that makes up the test, verifying that the correct dialogs and error messages are displayed at the appropriate times, but the "user acceptance" part applies only if a person with domain knowledge has tried the test and agreed, "Yes, this is how the program should work."
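To make the distinction concrete, here is a minimal sketch in Python. The `validate_login` function and its messages are hypothetical, invented purely for illustration: the script can confirm that the expected error message appears, but the judgment that this is the right message was made by a person before the check was ever written.

```python
# Hypothetical application code: reject empty usernames with an error message.
def validate_login(username: str, password: str) -> str:
    if not username:
        return "Error: username is required"
    return "Welcome, " + username

# Automated checks: they verify the functionality behind an acceptance test.
# Each assertion encodes a behavior someone already agreed was correct;
# the script cannot decide whether that behavior is acceptable.
def test_empty_username_shows_error():
    assert validate_login("", "secret") == "Error: username is required"

def test_valid_login_greets_user():
    assert validate_login("pat", "secret") == "Welcome, pat"

if __name__ == "__main__":
    test_empty_username_shows_error()
    test_valid_login_greets_user()
    print("Functional checks pass -- but only a human can call this 'accepted'.")
```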

I'm afraid that people then get lulled into a false sense of security. "Our automated acceptance tests are passing, so our product meets our users' needs."

What do you think? What have your experiences been with automating user acceptance testing?

3 comments:

  1. I'd like to offer a slight reframe.

    I agree that you can't automate *acceptance*. Only a human can make the decision of whether or not to accept a given feature, product, or system.

    But it is possible to automate a set of predefined, agreed-upon acceptance tests that represent the business stakeholders' expectations about the behavior of the system.

    And I believe that there is value in doing so. Such automated tests can give us a measurable indicator of our progress on current development and a regression test suite for completed development.

    I also agree that asking non-programming analysts to write automation code is a losing proposition.

    I have seen a whole lot of bad test automation code come from people who didn't know how to code and didn't want to learn, but who had been handed a tool and told to "go forth and automate."

    Test automation involves programming. I have never seen a test automation effort succeed without someone doing some coding somewhere. Record-and-playback may be a useful tool, but it is not a viable overall test automation strategy.

    That's why I don't think test automation can be done in isolation. Achieving effective test automation requires the whole team to collaborate, with testers and analysts articulating expectations and developers writing the automation code.

    So ultimately I agree that business analysts attempting to automate user acceptance testing is doomed to fail.

    But business analysts collaborating with developers to create automated acceptance tests can work really, really well. (One way to split that work is sketched after the comments.)

  2. You can automate any repetitive process. Part of acceptance testing is often a repeat of regression tests, which is ideal for automation.

    You cannot automate the real-time, intelligent analysis that a skilled tester performs (often subconsciously) when testing.

    Automation should be used to take care of the non-intelligent, repetitive work, enabling the tester to spend more time analysing and less time clicking buttons.

    Automation is a component of a test approach and a service to testers.

  3. Yes, I certainly hope we don't throw the automation baby out with the bathwater. There is so much value in automating tests.

    But I think Nick's point is important. Watch out for "Our automated acceptance tests are passing, so our product meets our users' needs." (cringe) I have had teams ask me to set this up for them so that the acceptor doesn't have to put their hands on the product for acceptance.

    As Elisabeth said, you can't automate "acceptance." There's no getting around the need to have the acceptor try the product, hands on, clicking away.

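Following up on the first comment's point about collaboration, here is a minimal sketch in Python of how that division of labor might look, loosely in the spirit of table-driven acceptance-testing tools. The `shipping_cost` function, the `EXPECTATIONS` table, and the business rule itself are hypothetical, purely for illustration: analysts own the expectations as plain, readable data, while developers own the harness code that executes them.

```python
# Hypothetical domain code a developer would wire the acceptance checks to.
def shipping_cost(order_total: float) -> float:
    """Orders of $50 or more ship free; otherwise shipping is a flat $5."""
    return 0.0 if order_total >= 50.0 else 5.0

# Expectations table: plain data that a business analyst can read, review,
# and edit without writing any automation code. (Values are illustrative.)
EXPECTATIONS = [
    # (order_total, expected_shipping, description)
    (49.99, 5.0, "just under the free-shipping threshold pays shipping"),
    (50.00, 0.0, "exactly at the threshold ships free"),
    (120.00, 0.0, "well above the threshold ships free"),
]

# Harness: developer-written code that turns the table into executable checks.
def run_acceptance_checks() -> None:
    for order_total, expected, description in EXPECTATIONS:
        actual = shipping_cost(order_total)
        status = "PASS" if actual == expected else "FAIL"
        print(f"{status}: {description} (got {actual}, expected {expected})")

if __name__ == "__main__":
    run_acceptance_checks()
```

Even here, passing checks only show that the system matches the table; whether the table captures what users actually need remains a human judgment.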