Friday, April 30, 2010

Time Diffs

During one cycle of performance testing, the tool I was using logged only a test's start and end time. For some reason, it didn't log duration. So you'd see start time 10:34:23 and end time 10:39:00. Now, for a short test, say one that ran in 15 or 20 minutes, that wasn't a big deal; it's easy to do that kind of math in your head. But when the times were like this: start time 10:32:23 and end time 13:18:54, it would take me a minute or so to work out how long that was (2:46:31, for the record).

This was a frustrating experience, and one that was prone to human error. So I wrote a little C# app that allowed me to enter two times and see the difference between them. This let me note the test durations much faster and more precisely. To build your own time diff tool, fire up Visual Studio and create a form that looks like this:

For reference, I named the Start Time field txtStart, the End Time field txtStop, the Calculate button cmdCalculate, and the Clear button cmdClear. I was lazy and left the label that you see as 00:00:00 named label1.

Now, use the following code for the calculate button:
private void cmdCalculate_Click(object sender, EventArgs e)
{
    if ((txtStart.Text != "") && (txtStop.Text != ""))
    {
        DateTime start = DateTime.Parse(txtStart.Text);
        DateTime stop = DateTime.Parse(txtStop.Text);
        TimeSpan duration = stop - start;
        label1.Text = duration.ToString();
    }
}

Here's the code for the Clear button:
private void cmdClear_Click(object sender, EventArgs e)
{
    txtStart.Text = "";
    txtStop.Text = "";
    label1.Text = "00:00:00";
}

And that's it. You now have a little app that will calculate the difference between two times for you. One thing to note: the app isn't intended to diff durations greater than 24 hours, and you have to enter the times in 24-hour clock format. So if your test ended at 2 pm, you'd enter 14:00:00.
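If you wanted to handle a test that runs past midnight, one option is to add a day whenever the raw difference comes out negative. This is just a sketch of that idea, pulled out into a standalone method rather than wired into the form above:

```csharp
// Sketch: compute a duration that tolerates an end time past midnight.
// Still assumes the test runs less than 24 hours total.
using System;

public class TimeDiff
{
    public static TimeSpan Duration(string startText, string stopText)
    {
        DateTime start = DateTime.Parse(startText);
        DateTime stop = DateTime.Parse(stopText);
        TimeSpan duration = stop - start;

        // If the test crossed midnight, the raw difference is negative;
        // adding one day gives the actual elapsed time.
        if (duration < TimeSpan.Zero)
            duration += TimeSpan.FromDays(1);

        return duration;
    }

    public static void Main()
    {
        // A test starting at 23:50:00 and ending at 00:10:00 ran 20 minutes.
        Console.WriteLine(TimeDiff.Duration("23:50:00", "00:10:00"));
    }
}
```

You could drop the same check into cmdCalculate_Click if overnight runs ever become a problem.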

This is a nice example of creating a tool that augments manual testing. It doesn't do any testing on its own, but it does let a tester perform a given activity faster and more efficiently. Always be looking for little opportunities like this, and your testers will thank you for them.

Wednesday, April 28, 2010

Focus on Test Approach, Not Test Tools

There's a lot of news flying around right now about how PowerPoint is making us stupid. Thing is, I don't think this is a PowerPoint problem; I think it's a people problem. Slides probably aren't the best way to communicate this information, and that certainly needs to be remedied. So it's the approach these presenters are taking that's the problem, not PowerPoint itself.

This got me thinking about some of the negativity that comes up when the honeymoon period with a test automation tool ends. I've seen teams whose intent/dream/goal was to create UI automation tests via a record and playback tool. They envisioned doing a single recording and all their tests would then be able to run at the push of a button.

It's a nice dream, but to get the most out of your automation efforts, you need to know what you're doing. It's impossible to automate all your tests, and quite frankly, you probably don't need to. I once worked for a manager who insisted that every bug have an automated regression test. Think about that for a minute. All bugs. Every one of them. Even the ones citing spelling mistakes.

Needless to say, that wasn't a good use of my time, and I'm pretty sure my blood pressure was in the near fatal range for a while there.

When you're thinking about automation, think about what makes the most sense to automate. Smoke test? Yep. Get that going and integrate it with your build process. Regression test? Yep, but don't go crazy. Focus on the primary workflows of your application, don't try to automate a test for every bug you've ever logged. That whizbang new functionality that Sally just rolled into the build? Probably not.

That whizbang new functionality probably shouldn't be automated yet because its UI will change, and most likely will change a lot over the course of the release cycle. Instead, use automation to augment manual and exploratory testing for this feature. Let's say Sally's piece is a report generator that pulls data from a database. Build some automation that sets up different data in the database, letting you test more report types more quickly. Maybe you can set up a batch file that calls out to a diff tool and compares a report from today's build to a baseline.
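That baseline comparison doesn't take much code, either. Here's a rough C# sketch of the idea, skipping the external diff tool entirely; the file names are made up for illustration:

```csharp
// Sketch: compare today's report against a saved baseline, line by line.
// "baseline.txt" and "report_today.txt" are placeholder file names.
using System;
using System.IO;

public class ReportDiff
{
    public static bool FilesMatch(string baselinePath, string currentPath)
    {
        string[] baseline = File.ReadAllLines(baselinePath);
        string[] current = File.ReadAllLines(currentPath);

        if (baseline.Length != current.Length)
            return false;

        for (int i = 0; i < baseline.Length; i++)
        {
            if (baseline[i] != current[i])
            {
                // Report the first mismatch so the tester knows where to look.
                Console.WriteLine("Line {0} differs:\n  baseline: {1}\n  current:  {2}",
                    i + 1, baseline[i], current[i]);
                return false;
            }
        }
        return true;
    }

    public static void Main()
    {
        bool ok = FilesMatch("baseline.txt", "report_today.txt");
        Console.WriteLine(ok ? "Report matches baseline." : "Report differs from baseline.");
    }
}
```

A line-by-line comparison like this is deliberately dumb; if the reports contain timestamps or other data that legitimately changes between runs, you'd want to strip or mask those fields before comparing.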

The approach you take will largely determine your automation effort's success. Don't assume that just because you bought a record-and-playback tool that you've got test automation covered. The test tool is a dumb computer program; you're the one with the brain. Know when to employ the tools you have, when to use something else, and your testing will go much smoother.

Friday, April 9, 2010

What's in it for me?

As a presenter, that's the question you always have to be asking yourself. Imagine yourself in your audience's position - you are taking time out of your busy workday to view a demonstration. As such, you want to know how this new product or service will benefit you. When you first sit down to hear the presenter speak:
  • You do not care about the presenter's company history.
  • You do not care about the awards the presenter's company has won.
  • You do not care about the demonstrated product's history.

Yet it's amazing how many people start their presentations off with a bunch of slides that agonizingly detail those points. When I watch a demo, I want to know if a product is going to help me do my job effectively. Once I know that it will, then it might be nice to know more about the awards and company history. But that's not what's essential.

The essence of the demo should be showing the audience how your product will solve a problem they have. Before you present, make sure you understand the audience's pain. Are they performing a tedious task that is prone to errors? Are they trying to speed up their app's performance? Do they need a better way to communicate between multiple departments?

Once you understand the problem, you can apply your product's features and show how those features turn into benefits for the audience. And that's the key - too many people will show a cool feature in their product, but not state how that feature benefits the audience. Always make sure you state the benefit - "Cool Feature X is important because it allows you to process your data 4x faster than before, which saves you a ton of time over the course of a release."

By tailoring your demos so you show how to solve your audience's problems, you'll be a much more effective presenter.