Thursday, February 26, 2009

Obfuscation Question

Many companies employ obfuscation to keep their code from being reverse-engineered. But this leads to an interesting problem: a GUI that's been obfuscated can't be automated, because the automation tools can no longer recognize the renamed objects. The only workaround I've been able to come up with for GUI automators is to write & run tests against non-obfuscated code. The problem arises when a manager says, "But then we aren't testing the same thing that we ship to our customers."

This got me thinking about how obfuscation impacts other kinds of testing. I would think that obfuscation would impact any tests that are run after the obfuscation process. So is it safe to say that companies that employ unit tests must be running those tests prior to the obfuscator running? If so, should the manager be concerned that the unit tests were run against non-obfuscated code?

Obfuscation would also prevent any other type of post-build white-box testing, so any tools I've written that call methods directly from dlls or invoke web services would be stopped dead.
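
To make the problem concrete, here's a minimal sketch (with hypothetical assembly, type, and method names) of the kind of white-box test hook I mean. It loads a dll and calls a method by name via reflection; once the obfuscator renames ReportEngine and GenerateReport, the type lookup returns null and the test dies.

using System;
using System.Reflection;

class WhiteBoxHook
{
    static void Main()
    {
        // Load the application's dll and call a method by its original name.
        Assembly app = Assembly.LoadFrom(@"C:\Builds\MyApp.dll");

        // Works against a non-obfuscated build; returns null once the
        // obfuscator has renamed the type to something like "a1".
        Type engineType = app.GetType("MyApp.ReportEngine");
        MethodInfo generate = engineType.GetMethod("GenerateReport");

        object engine = Activator.CreateInstance(engineType);
        object report = generate.Invoke(engine, null);
        Console.WriteLine(report);
    }
}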

So what's the best solution here? Is it to do the bulk of your automated testing against non-obfuscated code and then do manual sanity checks against an obfuscated build? Or is there a better way that I'm not seeing? Please let me know in the comments.

Wednesday, February 25, 2009

Pointers on Pointing

Gestures are a powerful part of any presentation. However, you have to be careful when you point at objects or people to ensure your body language doesn't send the wrong message. Here are a few things I've learned when pointing during presentations:

Don't use a laser. I love giving presentations; I get really excited when I'm in front of people. But that excited energy tends to make my hands shake a little. It's enough that I can't quite focus a laser dot on a single point without it bouncing a bit. If you're really nervous about being in front of people, the dot may dance all over the screen. That's distracting to your audience, and it takes away from the message you're trying to deliver. (Plus we've all seen those comics where the guy with the laser pointer accidentally beheads someone in his audience...)

Point at slides, point at objects, but don't point at people. We're all taught in school that it's not polite to point. Whether you intend it or not, pointing at people implies some measure of accusation. You want to invite your audience into your message, not make them feel like they're on trial. So instead of pointing at someone, use an open hand with palm up and fingers out. This is a welcoming and encouraging gesture that draws people in rather than putting their backs up.

Use the mouse sparingly. If your mouse is bouncing around your screen throughout the presentation, people are going to get dizzy. Peter Cohen calls this "Pointer Palsy", and it's extremely distracting. Perform your actions slowly, so the mouse isn't racing around the screen. If you are giving a remote demo, you can gesture with the mouse, but use these gestures sparingly. If you don't need to gesture at something, take your hand off the mouse. You'd be surprised how many of us move the mouse around without realizing it or intending to.

Are there other "pointers" you've learned? If so, please share them in the comments.

Monday, February 23, 2009

Notes on Demos

I give lots of demos every day, and I'm always on the lookout for things to make them better/smoother/cleaner. So the next time you have to give a technical presentation, here are some (hopefully) helpful tips:

Turn off all unnecessary applications. You don't want that IM from your significant other coming in during the middle of a discussion. Turn off your email, chat, Twitter and any other programs that aren't relevant to your demonstration.

Remove any potentially offensive/confidential material from your recent items. The pics your brother-in-law sent of his latest Vegas exploit may be cool/funny, but they're not professional. On a related note, be professional when naming your computer. During a presentation I sat in on last week, the presenter's laptop was named WideAss. And since the name of his machine was prominently displayed in several places in the app being demonstrated, it's not like you could miss it. That's just not professional, and I couldn't ignore it.

Make it readable. If you're using slides, keep the font big. If you have to shrink the font to get everything to fit, edit, delete, or split it into multiple slides. If you're demoing code, keep that big, too. Scott Hanselman suggests using Lucida Console at 14-18 pt for all code demonstrations, and I've found that works very well. You lose your rhythm as a presenter very quickly when you have to keep asking if everyone can read what's on your slides.

Finally, and most importantly, practice! Spend some time saying exactly what you're going to say. Don't go halfway on this one; sitting in front of your slides saying "ok, now I'll talk about X" will not make an effective presentation. Say exactly what you're going to say, exactly how you want to say it. If that means you need to retreat to a conference room or hide in your car at lunch break, then do it. The more familiar you are with what you're going to say, the more confident you'll come across.

I'll post more demo tips as I can, but what else have you done to make your presentations more effective?

Friday, February 20, 2009

Portable Tools

There are times when your test machines may not have all the tools on them that you need to perform a test effectively. Sometimes you're not allowed to install other apps on a system due to a security concern. Luckily, many applications can be run from a flash drive. This lets you get your job done without contaminating the system. Here's a quick list of some of my most frequently used portable apps. You can download a full suite of programs from PortableApps.com, and many more from PortableFreeware.com.

Notepad++ - lightweight editor for pretty much any type of script.
Recuva - lets you quickly recover deleted files.
Wireshark - packet monitoring tool useful when load/performance testing.
Portable Python - a full Python development environment on your flash drive.
Foxit PDF Reader - because sometimes, Acrobat just isn't there.
Query Express - a simple SQL Query Analyzer look-alike.
WinMerge Portable - lets you quickly perform a diff between two files.

What other portable tools do you use?

Wednesday, February 18, 2009

Tips on Hiring an Automation Engineer

Many QA managers don't know what to look for when hiring an automation engineer. They'll specify some generic criteria (knowledge of C#, knowledge of [insert major commercial tool name here]) and stop at that. Then, when an applicant comes in, the manager asks if they meet those criteria, and the technical side of things stops there. So here are a few tips I'd like to pass along when interviewing a candidate for an automation job.

Be wary of "automation engineers" who only know how to use commercial UI automation tools. Those tools are great, and they have their place, but they are not the end-all be-all of test automation. You also want to make sure they can write their own code outside of the automation tool. For example, many commercial vendors use VBScript as a programming language. If your candidate claims they can write VBScript, ask them to describe how they would perform a scenario outside of the UI tool in that scripting language.

If you already own a UI automation tool, don't hire someone who is an "expert" on a different tool. You're probably thinking that the candidate should be able to pick the other tool up quickly enough, and that may be true. But on a number of occasions I've seen people gripe and complain about how their favorite tool does A, B and C better, and this product is garbage. Some will even try to force a change in the department, to displace the existing tool with their preferred tool. This is disruptive and nothing but a time sink. The time spent arguing about tools is time that should be spent testing. If you've made an investment in a UI automation tool, hire people who are comfortable with that tool.

Describe your development process and the app being developed, then ask them where they see opportunities for automation. If they focus exclusively on the UI, you have someone who won't be as effective as a candidate who also talks about unit tests, testing an API, or testing the backend database. The key here is that a good automation engineer will look for places throughout the software development lifecycle where automation can be applied.

Ask if they've ever built their own tools, or if they've done any programming for fun. Someone who really enjoys coding will be much more effective than someone who just does it as a day job. Plus, someone who can build their own tools will be able to continue to automate in areas where commercial tool vendors don't. For example, if you need to test against a beta operating system, database server or UI element, commercial tool vendors won't support you. Someone who can build their own tools will still be able to create automated tests.

Finally, and this is the most important: do they know how to test? Automation folks are testers too, and you don't want to hire a script monkey. You want to make sure that, as Karen N. Johnson put it, "...you feel confident that [they] can be handed an application, learn at least its primary aspects at a fairly rapid pace, jump in, join this project and find bugs." You want to make sure that these people can come up with useful scenarios for automation that will add value to your team; you don't want someone who's only able to take a manual test plan and turn it into an automated UI test.

What tips do you have for hiring an automation engineer?

Monday, February 16, 2009

Good Coding Practices

Ben Simo (http://www.questioningsoftware.com/) mentioned via Twitter that he wanted to do a presentation on good automation coding practices. That got me thinking about the things I do to keep my code maintainable, readable and ultimately, useful as projects progress. Here's a quick list of some of my suggested coding practices:

Give functions/methods and variables descriptive names. When you're writing your code, don't name functions arbitrarily, like Test1. Instead, give them descriptive names so it's immediately obvious what the code is going to do. For example, "verifyAdminUserCanDeleteReport" may be a bit wordy, but you know exactly what it's going to do. Similarly, give your variables descriptive names as well. I like to prefix my variables with a shorthand representation of their type; for example, str for string, int for integer. So if I created a variable for a user name, I'd name it strUserName. This helps eliminate confusion, especially when you're dealing with a big test.
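
Here's a minimal sketch (hypothetical test and variable names) contrasting the two styles:

public class ReportTests
{
    // Unclear: what does Test1 check? What is s?
    public void Test1()
    {
        string s = "jhowland";
    }

    // Clear: the name states the scenario, and the prefixes flag the types.
    public void VerifyAdminUserCanDeleteReport()
    {
        string strUserName = "jhowland";
        int intReportId = 42;
        // ...log in as strUserName, delete report intReportId, verify it's gone...
    }
}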

Comment everything. The code may seem completely clear to you as you write it, and so commenting may seem like a waste of time. But chances are that when you come back to it after 6 months on something else, you won't remember what you were doing. Or, when someone else inherits your script, they won't know what's going on. Comments provide a nice, easy way to prevent that type of confusion. I like to include information like this with each function I create:
/*
Name:

Description:

Syntax:

Parameters:

Return Values:

Example:

*/
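
For example, here's what that header might look like filled in for a hypothetical login helper:

/*
Name:          LogInAsUser

Description:   Logs in to the AUT with the supplied credentials and
               waits for the main window to appear.

Syntax:        LogInAsUser(strUserName, strPassword)

Parameters:    strUserName - the account to log in as
               strPassword - that account's password

Return Values: true if the main window appeared; false otherwise

Example:       blnLoggedIn = LogInAsUser("jhowland", "s3cret")
*/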

You can also use utilities like TwinText to generate help files based on your comments. This provides a nice reference that people can browse for functions that may be useful in their own testing.

Don't chain tests. I see this a lot in UI automation, where "Test B" is dependent on "Test A"'s results. For example, let's say you are testing a telephone directory application. Test Case A scrolls to the "Howland, Jessica" user record in the phone book and verifies its data. Test Case B clicks the Next button 3 times to get to the "Jameson, Rita" user record, and then verifies its data. If Test Case A fails to reach "Howland, Jessica", there's a chance that Test Case B will fail, because the number of clicks needed to get to "Jameson, Rita" is dependent upon starting at the "Howland, Jessica" record. This is a chain, and it can lead to problems with your script. A better approach is to start at a known point in the database (such as the first record) and use a parameterized loop to click Next until "Jameson, Rita" is displayed in the Name field.
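
Here's a minimal sketch of that independent approach, assuming a hypothetical IPhoneBook wrapper around the application's UI:

using System;

// Hypothetical wrapper around the phone book UI; the names are illustrative.
public interface IPhoneBook
{
    void GoToFirstRecord();
    void ClickNext();
    string NameField { get; }
}

public static class PhoneBookTests
{
    // Independent test: it starts from a known point instead of relying on
    // wherever a previous test left the application.
    public static void VerifyJamesonRitaRecord(IPhoneBook phoneBook)
    {
        phoneBook.GoToFirstRecord();               // known starting point
        const int maxClicks = 500;                 // guard against looping forever
        int clicks = 0;
        while (phoneBook.NameField != "Jameson, Rita")
        {
            if (++clicks > maxClicks)
                throw new InvalidOperationException("'Jameson, Rita' was never displayed.");
            phoneBook.ClickNext();
        }
        // ...verify the record's data here...
    }
}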

Use source control. This lets you keep a versioned history of your tests, and lets you see who's edited what parts of the tests.

Keep your own backups. The source control server may be backed up, the network drive where you keep your working copies may be backed up, but do yourself a favor and keep your own backups of your tests. External hard drives are cheap, and writing a short script to copy everything from your working directory onto the external drive will only take a minute or two. This can save you a lot of headaches and frustration when servers crash; when your IT guys realize the backup drives are full and haven't successfully backed up in a month; or when you get told that your restore task is at the bottom of the to-do list.
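
A minimal sketch of such a backup script, with hypothetical source and destination paths:

using System;
using System.IO;

class BackupWorkingCopy
{
    static void Main()
    {
        // Hypothetical paths: working directory -> external drive.
        CopyDirectory(@"C:\Work\AutomationScripts", @"E:\Backups\AutomationScripts");
        Console.WriteLine("Backup finished at " + DateTime.Now);
    }

    // Recursively copies a directory tree, overwriting older copies.
    static void CopyDirectory(string source, string target)
    {
        Directory.CreateDirectory(target);
        foreach (string file in Directory.GetFiles(source))
            File.Copy(file, Path.Combine(target, Path.GetFileName(file)), true);
        foreach (string dir in Directory.GetDirectories(source))
            CopyDirectory(dir, Path.Combine(target, Path.GetFileName(dir)));
    }
}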

What good practices would you recommend?

Thursday, February 12, 2009

Tools Do Not Convey Knowledge

Let's say you've decided that you want to change the oil in your car. You don't want to take it to a shop, you want to do it yourself. So you go into Sears, find the Craftsman section, and purchase a mechanic's wrench set, an oil pan, and some jacks. You're all set, right?

One minor issue - you don't know *how* to change the oil; you've never done it before. But these tools were expensive, and they're supposed to make the job easy, so there shouldn't be a problem, right? So now, you get back to your house, crawl under the car and start messing around. After a few hours of frustrated cursing, you emerge from under the car covered in oil, your knuckles are torn up and you've damaged your car so badly that it'll need to be towed. You are all set to call Sears and light into them about what crappy tools they make because you couldn't even do a simple task like change your oil.

While extreme, and perhaps a little ridiculous, this scenario is exactly what happens when a company blindly introduces an automation tool without seeing if its people have the skills needed to use it properly. The statement "Automating tests is easy, you just click a record button" could be equated to "Changing the oil is easy, you just drain the oil and put in fresh stuff."

Make sure that you have a good understanding of what's going to be involved in your automation efforts before you put a tool into play. Do a small pilot project or proof of concept against a piece of your application and see if there are any surprises. That whizzy custom control your developer whipped up may only be accessible to someone who knows C#. That database test you want to perform may only be possible by someone who can write SQL stored procedures. Make sure that your people have those skills before you ask them to dive right in to automation.

Remember, tools enable people to work more effectively, but people need to understand how to use those tools before that can happen.

Wednesday, February 11, 2009

Testers don't break code

It was about 9 am when Carl, our lead developer, introduced me to Jesse, a recent college grad and our newest developer. Carl introduced me as "one of the QA team", to which Jesse said "Oh, you're one of the guys who breaks my code." I smiled at him and replied "Actually, the code's already broken by the time I get it." He paused for a moment, then laughed.

This perception of testers has always bothered me. I don't break anyone's code - I try to think of scenarios that the developer may have missed and help him make a better product. If I wanted to be a malicious little gremlin that "broke" code, I'd go into the source files and remove all the semicolons, or replace every instance of the letter e with a q. *That* would be breaking the code.

Testers are there to help developers build a better product. I think the issue arises when people take the issues testers log as personal negative feedback. One of the ways I was able to alleviate that was to do some pairs testing with a developer. We sat down, walked through the AUT, and I talked him through my thought process as I tested. At first he just watched, then started asking to try things as we went along. In short order, he had found issues that I wouldn't have thought to check for. Once he saw that I wasn't being malicious, his whole attitude toward the test team changed.

Have you encountered the "Testers break code" mentality? How did you resolve it?

Monday, February 9, 2009

UI Automation Safety Tip

Here's a safety tip for everyone who works with UI automation record & playback tools: turn off your email & IM programs before you start recording. Close down your Twitter, Gmail and Yahoo-based items as well. This has nothing to do with system resources or anything like that. Rather, it's because a surprising number of people out there will be in the middle of recording a test, then respond to an IM or decide to jot off an email.

And when they do that, the recorder captures it and puts it into the script, where anyone else who has access to it can see. I had a co-worker once who used to IM his girlfriend during work hours. We sat down one afternoon to review a script he'd been working on, and right in the middle of it was his side of one of their conversations. There was nothing racy or controversial in there, but he was very embarrassed to say the least. It didn't occur to him that the tool was recording *every* action he performed.

So please, for your own sake, turn off those social media utils while you're doing record & playback. (Or at least remember to click the Stop Recording button before you use them.)

Wednesday, February 4, 2009

Incremental Automation

When you're automating, it may seem logical to automate an entire test plan before you begin running it. For example, if you're automating a smoke test, you may want to have all pieces of the smoke test scripted so that you can run the test at the touch of a button. That's a good goal to strive for, but that doesn't mean you can't automate a piece of that smoke test and then start running it right away.

Back when I was first learning to automate, my smoke test was a 26 page test plan that touched all the major features of our product. When I sat down to automate it, I started on page 1 and went step by step. I was doing great until I hit page 11, when a control in my AUT wasn't recognized by my test tool. I spent a couple of days tracking down the issue and coming up with a workaround. A similar problem happened later in the test. All in all, it took about 3 days to automate the test, and another 3 days to troubleshoot issues, so it was over a week before the script ever ran. This was my first experience with automation, so I thought this was probably normal.

Looking back, I can see that by not running the parts of the test that were finished, I lost a great opportunity. I could have had the first 11 pages of that test running while I troubleshot the issues. Then people would have seen that the automation was doing something, and I could've shown my manager that while I was encountering some issues, our first crack at automation was off to a good start. I've heard stories where people tried to automate an entire test and, due to one problem or another, didn't actually run the test until two or three days before their product was set to release. That sort of approach doesn't benefit anyone, and it winds up costing time and effort instead of saving it. It also reduces automation's credibility in management's eyes, which is part of the reason (I think, anyway) why so many automation efforts get abandoned.

So the moral of the story is: don't do "big bang automation", where you wait until everything is finished and perfect before you start running your automated tests. Get a piece of your test automated and then start running it. Then continue adding pieces until your entire test or suite of tests is finished. Your team will see results from automation faster, and your management will be more confident that the automation is helping the development effort.

Tuesday, February 3, 2009

You Can't Automate User Acceptance Testing

One of the things I hear from clients is "we want our business analysts to be able to automate our user acceptance tests." I cringe every time that phrase gets mentioned. It doesn't work for two reasons.

The first is that the analyst usually does not have the skill set needed to automate a test effectively, and management is reluctant to make that kind of investment in them. The second reason is that you can't automate a user acceptance test. You can automate the functionality that makes up that test and make sure the correct dialogs and error messages are displayed at the appropriate times, but the "user acceptance" part only applies if a person with domain knowledge has tried the test and agreed, "Yes, this is how the program should work."

I'm afraid that people then get lulled into a false sense of security. "Our automated acceptance tests are passing, so our product meets our users' needs."

What do you think? What have your experiences been with automating user acceptance testing?

Monday, February 2, 2009

Automation Mashers

Test automation is handy for regression testing, but automated regression tests rarely find new bugs. So instead of focusing exclusively on regression tests, try coming up with scenarios to detect bugs that would be difficult (or impossible) to find manually. For example, let's say my AUT can be used to generate reports. I enter some criteria, click "Generate" and the AUT fetches data from a database and dynamically creates a report based on my selections. Let's also say that this fetch and display operation causes a momentary spike in my CPU usage and memory usage before settling back to normal.

Now, we could create an automated regression test that generated a report and verified it contained the correct data. That may be useful, but we can make it much more useful by turning it into a "masher" test. Mashers are tests that perform a single operation over and over again while tracking how the system performs over time. In doing this, we can find bugs (like memory leaks) that we wouldn't be able to find by manual testing alone.

So in my scenario above, I noticed that there's a momentary spike in CPU and memory use when the report is generated. Being suspicious, I wonder if there's the possibility of a memory leak somewhere in that code. So I'd create an automated test that ran my report and confirmed it was correct, and then loop it so that reports were being generated for 24 hours straight. As the test ran, it would log the system resources being used after each generation. At the end of the test, I'd graph the data that had been logged. If the system resource usage remains constant, then there's no problem. But if there's a steady increase in resource usage, then we have a memory leak, and we've found a bug that a regular regression test (manual or automated) would have missed.
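
Here's a minimal sketch of that masher, assuming a hypothetical RunReportAndVerify() test step and an AUT process named "ReportApp" (both names are illustrative):

using System;
using System.Diagnostics;
using System.IO;

class ReportMasher
{
    static void Main()
    {
        DateTime stopAt = DateTime.Now.AddHours(24);   // run for 24 hours straight
        int iteration = 0;

        using (StreamWriter log = new StreamWriter("masher_log.csv"))
        {
            log.WriteLine("timestamp,iteration,workingSetBytes");
            while (DateTime.Now < stopAt)
            {
                RunReportAndVerify();                  // the regression test itself

                // Log the AUT's memory use after each generation so the
                // results can be graphed when the run is over.
                Process aut = Process.GetProcessesByName("ReportApp")[0];
                aut.Refresh();
                log.WriteLine(string.Format("{0:o},{1},{2}",
                    DateTime.Now, ++iteration, aut.WorkingSet64));
                log.Flush();
            }
        }
    }

    static void RunReportAndVerify()
    {
        // ...generate the report and verify its contents here...
    }
}

Flushing the log after every iteration means the data survives even if the AUT crashes partway through the 24-hour run, which is exactly the kind of failure a masher is meant to provoke.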

So in addition to making manual testing easier, and taking care of regression tests, don't forget to look for places where automation can help you find bugs that would be impossible to find otherwise.