What's the one thing that consistently sets a successful person apart from the rest? Knowledge. The most successful people I know are always reading, watching industry trends, and keeping their ears to the ground for what's new & exciting.
I subscribe to about 30 different RSS feeds on coding, QA, and sales engineering. A lot of the people I follow on Twitter work in those fields. I read up on what's hot in those areas, and then I can speak more intelligently when I'm asked a question about Visual Studio 2010, agile test automation, or how to give a better presentation. That information helps me do a better job, which keeps me employed, which is a happy thing :)
No matter what your role, whether it's sales, testing, development, or something else, there's always something new for you to learn. Keep learning, and you'll never be bored, you'll keep your skillset current, and you'll be able to do your job more effectively.
Friday, September 4, 2009
Helpful Tool: Reflector
Everyone who works with .NET applications should know about Reflector. This handy little tool lets you inspect the code that makes up your application. Need to know what the heck a particular assembly does, and you don't have access to the source code or to the developer who wrote it? Reflector will show you all the methods and classes that live inside the assembly, what parameters they take, and so on. There are a bunch of plugins available, too, that let you view more info about Silverlight apps, gather quality metrics, and more.
Check it out at http://www.red-gate.com/products/reflector/
Wednesday, September 2, 2009
Even Robots Need Maintenance
When people are first exposed to test automation, one of the questions I often hear is: "If the application changes, like if they totally reorder & redesign the screens in a given workflow, will my automated test break?"
Well, yes. Your manual test cases will need to be changed as well. Your manual testers may also need to be retrained on your app if it changes that dramatically. The point is that your automated tests are going to need to change and evolve as your application changes and evolves. Anyone who tells you that an automated test is bullet-proof and can't be broken is selling something.
So yes, you will need to refactor, repair, and update automated tests as you progress. You can make your tests change-resilient, but they'll never be change-proof. You'll need to plan for those changes in your release cycles, just as you would any other test activity. Be aware of that, keep it in mind, and your automated test efforts will go much more smoothly.
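One concrete way to build in that resilience is to centralize the things most likely to change, like your UI locators. Here's a minimal sketch in Python; the control names and the find_element/type_text/click API are hypothetical stand-ins for whatever your tool provides:

```python
# A minimal sketch of one change-resilience technique: keep UI locators in
# one place so a redesigned workflow means one update, not dozens. The
# control names and the element API are hypothetical.

LOGIN_SCREEN = {
    "username": "txtUserName",
    "password": "txtPassword",
    "ok_button": "cmdOK",
}

def login(app, user, password):
    """Every login test calls this, so a UI change gets fixed in one spot."""
    app.find_element(LOGIN_SCREEN["username"]).type_text(user)
    app.find_element(LOGIN_SCREEN["password"]).type_text(password)
    app.find_element(LOGIN_SCREEN["ok_button"]).click()
```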
What do you do to plan for change in your automated tests? Sound off in the comments.
Monday, August 31, 2009
Helpful Tool: BGInfo
If you have a lot of systems in your test lab, it can be hard to remember each machine's specs. Enter BGInfo. This handy little app displays all manner of system information as your desktop background, including CPU, RAM, IP address, system name, and more. You can also customize it with scripts to show any other relevant info, like what version of MS Office or Firefox a particular system has.
BGInfo is free to download from this location.
Enjoy!
Friday, August 28, 2009
Helpful Tool: Recuva
"Oh Sh*t!"
We've all heard a co-worker say something like that when they've accidentally deleted a file. Sometimes it's in the Recycle Bin, but for those unfortunate souls who pressed Shift+Delete, deleted a file from a flash drive, or emptied the Recycle Bin without thinking, there's Recuva.
Recuva's a great little tool for recovering deleted files. It's free, uses minimal system resources, and can be run from a flash drive. It's a great tool for anyone who troubleshoots computers.
Check it out at http://www.recuva.com/
Wednesday, August 26, 2009
Monitor Your Environment
It's important to keep tabs on what's installed on your test systems. Changes in your environment can alter test results and really throw a wrench in your day.
I was running performance tests once. My developer and I had tweaked the systems to their best performance, and the automated tests had been refined. We ran tests before we went home that night, and confirmed our results were good. The next morning, our VP wanted us to run the tests again so he could watch. Imagine our surprise when all our operations were taking 1.5 - 2x longer.
My VP immediately suspected that the test tool was faulty, and that's never a good way to start a discussion. Luckily, I had written a Python app that ran in the background on all my systems. It was a simple little affair: it polled the Uninstall section of the Windows Registry and logged anytime a value was added or removed. During the night, a Windows update had been automatically applied, which adversely impacted my system's performance. Without my utility, there's no way I would've caught that.
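For the curious, here's a minimal sketch of what such a watcher might look like in Python today. The key path is the real Uninstall location, but the polling interval and log format are illustrative rather than what I originally wrote:

```python
# Minimal sketch of a registry watcher like the one described above.
# Polls the Uninstall key and logs when entries appear or disappear.
import time
import winreg

UNINSTALL_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"

def list_uninstall_entries():
    """Return the set of subkey names under the Uninstall key."""
    entries = set()
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UNINSTALL_KEY) as key:
        subkey_count, _, _ = winreg.QueryInfoKey(key)
        for i in range(subkey_count):
            entries.add(winreg.EnumKey(key, i))
    return entries

def watch(poll_seconds=60):
    known = list_uninstall_entries()
    while True:
        time.sleep(poll_seconds)
        current = list_uninstall_entries()
        for name in current - known:
            print(f"{time.ctime()}: ADDED   {name}")
        for name in known - current:
            print(f"{time.ctime()}: REMOVED {name}")
        known = current

if __name__ == "__main__":
    watch()
```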
We spent a couple hours on the phone with Microsoft and got a workaround to the problem. That done, we ran the tests again, and voilà, everything was happy. From that point on, I disabled automatic updates on all my test systems.
I lost a couple of hours to this problem, and my test automation's reputation was on the line, all because of an automatic update. So keep tabs on your environment, and make sure you know what's happening to it.
Monday, August 24, 2009
Automation Honors
I've just found out that my blog is a finalist for an Automation Honors award from ATI. ATI is a great resource for automated testing, and I'm honored to have been nominated.
Please head over here and vote for me!
http://www.automatedtestinginstitute.com/home/index.php?option=com_mad4joomla&jid=2&Itemid=137
You can learn more about ATI here: http://www.automatedtestinginstitute.com
Test Automation Does Not Replace Testers
Back when automobile companies started putting robots on the assembly line, lots of blue-collar workers lost their jobs. The robots were more efficient, faster, and didn't need bathroom breaks. By switching over to an automated process, the auto companies produced more cars faster, and, according to them, more cheaply and efficiently.
Some folks have the same thought about automated testing. "If we automate all our tests, we can lay off half our test team! Think about the cost savings!"
These people are idiots.
See, testing isn't like auto manufacturing. An assembly line worker who's putting wheels on a car does just that. He's not inspecting the wheel for defects, he's not verifying the welds on the axle where the wheel goes, he's just attaching the wheel. There's an inspector later on down the line that checks the work. That assembler's work can be automated with no problem, because he's doing a simple, repetitive, monotonous task.
Testing is different. Yes, there are monotonous regression tests that need to be run, but even when your testers are following a test case, they're still observing beyond what's in the test. If a test step says "Click the OK button to see the login screen" and doing that shows the login screen, that's great. But if clicking that button shows the login screen and also turns the screen bright pink, a tester will log a bug, even though the scripted behavior is correct. Robots don't think beyond what they're told. They can't deduce, reason, or infer. Remember that.
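To make that concrete, here's a hypothetical sketch of such a scripted check; note that nothing in it could ever catch the pink screen:

```python
# Illustrative sketch: an automated check verifies exactly what it was told
# to verify, and nothing more. The element API here is hypothetical.

def test_ok_shows_login(app):
    app.find_element("cmdOK").click()
    assert app.find_element("loginScreen").is_visible()  # this passes...
    # ...even if the entire screen just turned bright pink. Nothing in this
    # test looks at colors, layout, or anything else a human would notice.
```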
Also, when a robot replaces an assembly line worker, it completely replaces all the tasks that worker did. In my example above, the only thing that worker did was put wheels on a car. It's highly unlikely that your manual test team only has a handful of test cases. More likely, they're scrambling to make sure the basic functionality test cases are covered. Automating the basic tests will free them up to work on more advanced tasks, of which, let me assure you, there is no lack.
Automation augments your testers, and it lets them work more efficiently. But it should never be viewed as a way to replace the people on your test team.
Friday, August 21, 2009
Helpful Tool: Process Monitor
Have you ever wanted to see exactly what files your app was accessing, what registry keys it was working with, or what DLLs it was loading? Then Process Monitor is for you. This handy little app can show you everything that's happening on your system, or be filtered down to a specific process. You can download it for free from here.
It's Windows only, so if there are similar tools that you use for Linux, please sound off in the comments.
Wednesday, August 19, 2009
Helpful Tool: WinMerge
Need a tool that can diff those really long reports you just generated? Have two script files that look like they contain the exact same info but behave differently? You need a diff tool. From their site:
"WinMerge is an Open Source differencing and merging tool for Windows. WinMerge can compare both folders and files, presenting differences in a visual text format that is easy to understand and handle."
WinMerge is Windows only, so sound off in the comments if you have a favorite diff tool for OS X or Linux.
You can download WinMerge from here.
Monday, August 17, 2009
Helpful Tool: Wireshark
When you're doing web testing, whether it's functional or performance, it's always a good idea to be able to "see" exactly what's going across the wire. That's where Wireshark comes in. Wireshark is an open source network packet analyzer that lets you see each individual request that's made and the response that gets returned. It's available for Windows, OS X, and Linux, and can be downloaded for free from here:
www.wireshark.org
Thursday, August 13, 2009
Simple Tests
There's a great episode of Star Trek: The Next Generation where the omnipotent Q is made human. One of the tasks facing the crew during this episode is a moon that's losing orbit and threatening to crash into its planet. Q tries to help Geordi and Data prevent this disaster, and tells them that there's a simple solution. Geordi, excited, asks what it is. Q's response: "Change the gravitational constant of the universe." Geordi's less than thrilled with this answer, to say the least, but that's how Q would have handled the situation.
I see something similar when people are first evaluating automated test tools. They come up with a "simple test" and if the tool fails to perform that test, they immediately write the tool off as useless.
It's true there are times when a given automated tool isn't a good fit for a particular application, but keep perspective here. You, the tester, are on the same level as an empowered Q. You see everything in your application, and know how to make it work. The test tool doesn't have your ability to reason, to identify the cause of problems, or to adapt. The test tool doesn't "see" the application the same way you do - it sees your application the way a *computer* sees it. You see a button labeled OK. Your test tool sees an extended WinForms control with a dynamically generated ID like cmdOK87823, or worse, the code may be obfuscated so that a test tool can't read any information from it.
The tool is operating on a much lower playing field than you are, and for it, finding that dynamically generated button in an obfuscated application is just as impossible as it would be for Geordi to change the gravitational constant of the universe. So what do you do in situations like this? You find ways to make the tool work with your application. In the TNG episode I mentioned, Data and Geordi found a way to send a warp field around the moon that would do something very similar to Q's solution. To find your dynamically generated controls, you could insert wildcards into your test scripts, so the tool would look for cmdOK* and thus match whatever identifier had been dynamically generated. You may need to run your tests against unobfuscated code. Just be aware that tasks like this may be a necessity, and you'll be much better off when it comes time to automate.
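To illustrate the wildcard idea outside any particular tool, here's a sketch using Python's standard fnmatch module, with made-up control IDs:

```python
# Sketch of wildcard matching against dynamically generated control IDs.
import fnmatch

# IDs as a tool might report them, with generated suffixes (illustrative).
control_ids = ["txtUserName4401", "txtPassword90211", "cmdOK87823"]

def find_control(ids, pattern):
    """Return the first control ID matching a wildcard pattern like 'cmdOK*'."""
    matches = fnmatch.filter(ids, pattern)
    return matches[0] if matches else None

print(find_control(control_ids, "cmdOK*"))  # -> cmdOK87823
```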
Wednesday, August 12, 2009
Snap Decisions
Occasionally someone will contact me on a Wednesday afternoon and tell me that they need a demo of my software right away, because they have to decide whether to buy it tomorrow. Now, some people say this because they think they'll get attention faster, but some people legitimately plan on making a purchase within 24 hours.
These calls scare the daylights out of me. My company provides a 30-day evaluation of our software so that people can try it out and make sure that it works well for them. I can show things off against sample applications, but at the end of the day, what matters is that it works with their application. If they rely solely on my demo, that second part is completely overlooked.
I liken this to buying a car. Would you walk into a dealership and purchase a new car just based on the commercial you'd seen? No, you want to take it for a test drive. You want to know how well it corners, how loud the engine is, how comfortable the seats are.
So if you have someone on your team who's pushing for a snap decision on a product, do everything you can to keep that from happening. Make sure some time has been spent ensuring that the program meets your needs. Otherwise you could end up in a bad spot where a tool has been purchased, it doesn't work, and now you have no money to get something else in place.
Monday, August 10, 2009
Automating Installs
A common scenario in a manual test case is something like this:
1 - Uninstall old version of application
2 - Install new build
3 - [Perform actual test here]
When people are starting out with test automation tools, they often want to automate steps 1 & 2. This makes complete sense, but the approach taken is almost always the wrong one. I've seen many people try to use record & playback tools to open the Control Panel, click Add/Remove Programs, and uninstall.
Now, conceptually, this shouldn't be a big deal, but when you figure that the Control Panel differs in almost every version of Windows, your recorded script will break quite easily. So instead, a better option is to use command line flags to remove your application.
Almost all the major install building programs allow for the creation of command line parameters. This lets you install your app with a command like "myapp.exe /AcceptLicenseAgreement /InstallToDefaultLocation". They usually have uninstall commands as well. Installing and uninstalling your app in this fashion makes it a lot easier to get new versions of your programs loaded for use with automated tests.
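As a sketch, a test setup script might drive those switches like this. The install flags are the illustrative ones from above, and the uninstall flags are invented, so check your install builder's documentation for the real switches:

```python
# Hypothetical wrapper around silent install/uninstall switches.
# All flag names here are illustrative, not from any real installer.
import subprocess

def install_build(installer_path):
    subprocess.run(
        [installer_path, "/AcceptLicenseAgreement", "/InstallToDefaultLocation"],
        check=True,  # raise if the installer exits with a nonzero code
    )

def uninstall_build(uninstaller_path):
    subprocess.run([uninstaller_path, "/Uninstall", "/Silent"], check=True)

# Usage: swap the old build out before the automated run starts.
uninstall_build(r"C:\Program Files\MyApp\uninstall.exe")
install_build(r"C:\builds\myapp.exe")
```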
Talk with your build engineer to get a list of the commands available for your product, and if there aren't any, work with him or her to get some implemented. It will only make your life easier in the long run.
Friday, August 7, 2009
LMGTFY
We've all had those moments where someone emails us a question that could've been answered with a simple Google search. Things like "What does IIS stand for?" "How many GB in a TB?" "What's the capital of Iceland?" (ok, I've never actually been asked that one, but you know what I mean).
For those moments when you're feeling just a little snarky at someone for throwing one of these questions your way, there's Let Me Google That For You. LMGTFY takes a question and creates a URL that you can email in response to the question. When your associate clicks the link, they'll see a vid of the question being typed into the Google home page and the Google Search button being clicked, followed by a message saying "Was that so hard?" Then the user is taken to the actual results of the search.
Here's a sample with the aforementioned "What's the capital of Iceland?" question. Enjoy!
Wednesday, August 5, 2009
Blind Faith
Your automated tests are great. They run unattended every night, and each morning, you come in and see a little report on your desktop with a bunch of green success messages. You feel good. You know those tests are making sure the basic functionality of your app is running smoothly, and there hasn't been a failure in days.
Then you give the first build of the app to the test team. Within an hour, 30 high priority bugs have been logged. You review the bugs and see that close to half of them are scenarios that are covered by the automated tests. Stupid testers, you think. They must be doing something wrong. You kick off your automated suite, and it passes. Then you try to run the same test by hand, and it fails. You try it again by hand, and it fails again.
Then you take a look at the automated test, and realize there's a flaw in its logic. You see that the test is configured to always pass, regardless of what's actually happening. You quickly dive into the code and fix the application, and then you go back and fix the tests.
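Here's the kind of flaw I'm describing, boiled down to a sketch with a hypothetical app object:

```python
# Boiled-down version of the always-pass flaw. The app object and its
# methods are hypothetical stand-ins.

# Before: this test can never fail, no matter what the app does.
def test_login_broken(app):
    try:
        app.login("User1", "Pass123")
        assert app.find_element("homeScreen").is_visible()
    except Exception:
        pass  # every failure is silently swallowed, so the report stays green

# After: let assertions and errors actually fail the test.
def test_login_fixed(app):
    app.login("User1", "Pass123")
    assert app.find_element("homeScreen").is_visible()
```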
The moral of the story here is to make sure you know exactly what your automated tests are doing. Have them code reviewed just like any other part of your project. Just because a report gets spit out saying everything passed, doesn't mean everything did.
Monday, August 3, 2009
Ninjas or Pirates? Both.
There's an age-old debate on who would win in a fight, ninjas or pirates. But as I once saw in a Nodwick comic, if you create Ninja Pirates, then you've got something absolutely unstoppable.
For some reason, this got me thinking about the arguments I hear over whether it's worth it to do exclusively automated testing or exclusively manual testing. On their own, both techniques are powerful, but you're going to find the most bugs when you combine the two. So augment your manual tests with automated tools that will let your testers do things faster and more efficiently. The end result will be well worth it.
Plus, you can brag to your friends that you've got a bunch of Ninja Pirates working for you.
Monday, July 13, 2009
More than One Right Way
In testing, we often focus on there being one right way to do something. This is sometimes known as the "happy path," the scenario you know works when all the input parameters are exactly right. Then we charge off the happy path into the dark unknown, where we expect to find bugs. But as this Lifehacker post reminded me, sometimes there's more than one right way to do something.
This video (http://lifehacker.com/5311002/open-a-banana-like-a-monkey) shows how to open a banana from the bottom. This was a bit of a shock to me, as I've been using the stem of a banana like a pull tab my whole life. The method shown in the vid is actually easier and cleaner, but both approaches work, and both are right.
When you're creating your test plans, take a few minutes and think about how many happy paths may exist. Those extra paths will help you create more thorough test cases and uncover more interesting bugs.
Wednesday, July 1, 2009
What goes down, must come up
I have to admit, I'm a soda junkie. Pepsi's my preferred beverage, but I love Jolt, Coke, Mountain Dew (especially the World of Warcraft varieties), root beer, and a bunch more. But I've learned the hard way that it's a *very bad* idea to have a Pepsi at arm's reach while doing a demo. The reason? Carbonation makes you burp, and depending on how much you drank, you may be burping a lot. A second issue is that odd 'quasi burp,' where your throat constricts because you can feel a burp coming, but it never arrives. Either of these makes you look less than professional, and you don't want various bodily noises distracting people from the message you're trying to deliver.
So, avoid the carbonated beverages for about 30 minutes before you give a presentation. Have a bottle of room temperature water handy in case you need a drink. (If the water's too cold, it can constrict your throat, and that's a whole different problem.)
Once you've given your killer, burp-free presentation, crack open a Pepsi for me!
Monday, June 22, 2009
Pull Up
Back when I was first working in automation, my AUT (application under test) was a Java-based course player. Users could learn about how to be a good manager, how to shake hands in Australia, and a myriad of other soft-skill topics. At the end of each learning session, the user would be given a 10-question multiple choice quiz on what they learned.
My job was to have my automation tool launch courses, answer questions, and confirm that the scoring mechanism in the app was behaving properly.
What made this particularly challenging to automate was that the questions were displayed in random order, and there were anywhere from 30 - 50 potential questions per quiz. So you never knew which questions were coming, or what order they were coming in.
I voiced my concern to my team lead, and she said that I should maintain a list of every question that could appear in the course, along with the answers to those questions. Now, that might have been OK if I were only dealing with one course, but there were dozens of courses I had to work with to ensure the app was behaving properly.
One of the things I learned early on was to "pull up," which is my way of saying take a breath, step back, and look at the whole picture. Is there a better way to handle this? What I realized was that somehow, the AUT knew what the right answer to any question was. If Choice C was right, then the AUT knew that. It was so blindingly obvious that I was ticked at myself for not seeing it sooner.
I went over to one of my developers and spent the next 10 minutes discussing my situation, my intended solution, and what was required to implement it. By the end of that afternoon, I had a method in place that would call out to the AUT and tell me which answer was correct for any given question.
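A sketch of what that might have looked like, with the quiz API and the developer's hook as hypothetical stand-ins:

```python
# Hypothetical sketch of the approach described above. get_correct_answer()
# stands in for whatever method the developer exposed; the quiz API is invented.

def take_quiz(aut, quiz):
    for question in quiz.questions():        # random order, random subset
        correct = aut.get_correct_answer(question.id)
        question.select(correct)             # answer every question correctly
    quiz.submit()
    # With all answers correct, the scoring mechanism should report 100%.
    assert quiz.score() == 100
```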
It was a heck of a lot cleaner than maintaining a question/answer list, and it made for good "press" internally for automation. It also raised my developer street cred, because folks saw that test automation was a lot more than just point-and-click record and playback.
So when you face a challenge in your automation, remember to pull up and make sure that you're not missing something obvious.
Thursday, June 11, 2009
The Kindle
My family bought me a Kindle 2 for my birthday a couple months ago, and I am totally enamored with the device. I really do forget that I'm not reading a physical book. The electronic ink the Kindle uses is crisp, and very easy on the eyes. The only downside is how easy it's become to drop a ton of money at Amazon.
Here's a brief sampling of some of the titles I've picked up so far:
The Prince - This book should be in everyone's library. Machiavelli has a bad rep, but this book is extremely insightful. I have a paperback copy, but it's getting pretty tattered.
UR - I'm a big Stephen King fan, and this novella is a story about the Kindle. He ties it in nicely with the Dark Tower and From a Buick 8.
Agile Testing: A Practical Guide for Testers and Agile Teams - This is a great overview of agile testing, and what a tester's role is in an agile environment.
The Little Red Book of Selling - It's marketed as a sales book, but this little gem can be applied to achieving pretty much any goal you may have.
Goblin Hero - If you've ever read epic or heroic fantasy, you owe it to yourself to pick up the Goblin books by Jim C. Hines. They give you a look at what it's like from the goblins' point of view, and they're really funny.
I'll post more in-depth reviews as time goes on. What are you reading these days?
Wednesday, June 3, 2009
State
One thing to remember with UI automation is that your application needs to be in the same state each time the test starts. You need to be on the right screen, with the right options enabled, and your test goes from there. This makes perfect sense to me, but I'm amazed at the people who get frustrated with that. They say "The tool should just know."
Now I agree that it's good to build logic into your tests that can get you onto the right screen if you aren't there already. It's also good to build recovery scenarios into your tests so you can get back to a good state if something goes wrong. However, to assume that the tool is just going to know what to do is ludicrous. The computer is a stupid box. You are the one with the brains. You tell the box what to do and it does it.
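Here's a hypothetical sketch of both ideas, navigating to a known state and recovering from a bad one; every method on the app object is an illustrative placeholder:

```python
# Sketch: get the app to a known state before the test runs, and recover
# to a clean state if something goes wrong. All app methods are hypothetical.

def ensure_on_screen(app, screen_name):
    """Navigate to the screen the test expects instead of assuming we're there."""
    if app.current_screen() != screen_name:
        app.navigate_home()              # known-good starting point
        app.navigate_to(screen_name)

def run_with_recovery(app, test, retries=1):
    """If a step blows up, reset the app and try once more."""
    for attempt in range(retries + 1):
        try:
            ensure_on_screen(app, test.start_screen)
            return test.run()
        except Exception:
            if attempt == retries:
                raise                    # out of retries: report the failure
            app.restart()                # recovery scenario: back to a clean state
```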
Come on, do you really want Skynet running your test efforts?
Wednesday, May 13, 2009
Stupid Password Masking
Log In/Out tests are commonplace throughout software testing. Pretty much every portal, client/server app, or web site I've worked with has required that I log in with a valid user/pass combo. Those credentials were always listed in the test plans I used: "Step 1 - Log in as User1/Pass123 and click Enter."
Now, there's no problem with putting the user/pass combo in the test plan, and then tacking it up on the wall. But why do the managers, whose teams are proudly displaying test user credentials alongside photos of their kids, suddenly balk if an automated tool stores passwords in plain text format?
"Someone else could learn the password" they cry. "This is horrible and insecure!"
OK, let's think about this for a moment. The tests will be placed in source control. Source control can restrict who can access the tests. Boom - problem solved. Let's look at it from another angle. Say your automated tool masked the password it entered, so that it just appeared as asterisks. How do you know what password is being entered? If your login is failing over and over, is it because the tool is entering a bum password? You'll never know.
Maybe I'm missing something here. But as a tester, I want to know what password my tests are using. Toggling a "hide password" feature on and off seems foolish too, as I can just toggle it to "show password" and bang - there it is again.
What do you think? Is there something really obvious that I'm missing here?
Wednesday, May 6, 2009
Clean Up Your Screen
When I watch demos, I always find myself distracted from the presenter's message by something on their desktop. Maybe there's an icon in the system tray I haven't seen before (or maybe something down there is blinking). Maybe they have a shortcut to World of Warcraft on their desktop, or maybe their desktop is nothing but shortcuts, giving the feeling that their Start Menu just threw up all over the screen.
I've whittled my desktop down to the Recycle Bin and three shortcuts. One is the application I demo with, and the other two are apps I use during my demonstration. Everything the audience sees has a point to being there.
As for the System Tray (or notification area, if you prefer), I recently discovered there's a way to force Windows to hide icons that are displayed there. Right click on the tray and select Properties, then click Customize on the Taskbar tab. You'll see a list of the items in your tray, and you can select if you want them to always be hidden, always shown, or only hidden if inactive. I set everything except my system's volume control and GoToMeeting to Always Hide.
It's a small thing, but it helps keep people focused on the message I'm sending, rather than wondering if I'm playing Horde or Alliance.
Monday, May 4, 2009
Machiavellian Automation
I spoke with a colleague recently about a company whose automated test efforts had suddenly stopped. They had been using a third-party resource for all their automated testing, and the money for that resource had dried up. As such, automated test development stopped, and no one in-house had the knowledge or expertise to pick up the effort.
This reminded me of something I'd read in Machiavelli's The Prince. Machiavelli argued that princes should not rely on mercenaries or auxiliaries, and should instead rely on their own people. The same applies here. I think it's great if you want to jump-start or augment your efforts by bringing in third-party resources, but you need to be building up your own people as well. Sooner or later, that outside resource will be unavailable, whether due to funding on your part or lack of time on theirs. If you've built up your own team so they can take the effort over, you'll have a smooth transition and no lost time or effort. If you haven't, your people will be scrambling to figure out how the automation works, and you'll waste time and money.
So make the investment in your people. Send them to training, buy them books, encourage them to learn. The end result will be in-house expertise, which will make all your efforts smoother in the long run.
Friday, May 1, 2009
Only Bitmap Comparisons? Bad Tester! Bad!
Many automation tools have the ability to take a screenshot of something in your AUT and compare it to a baselined image. The intention is that you can use this to verify that images have rendered properly in your app, so things like your company logo or a product's photo can be validated easily. But I still come across people who want to use bitmap comparisons as their only means of validation in their tests.
Their reasoning seems sound enough. "If we just click through each screen of our app, and do a bitmap comparison of each full screen, then we can easily confirm the app is working. The bitmap comparison will automatically validate every value in every field, so we'll be able to create our tests even faster!"
It sounds good, doesn't it? But it doesn't work that way in practice. In fact, this has to be one of the most fragile, breakable, and ineffective approaches to automation. See, bitmap comparisons do a pixel-by-pixel comparison of each screen. That means that if a single pixel is off, the test fails. In some cases, just running the same test on a different OS is enough to cause every bitmap comparison to fail, because a dialog in WinXP may have one pixel that's slightly lighter than that same pixel in Vista. To your eyes, they don't look any different, so you won't know there's a problem until it's too late. Imagine that all your tests suddenly fail, for no good reason. If this happens, you'll wind up in a maintenance nightmare as you struggle to track down that rogue pixel and somehow convince your automation tool that this is really OK. Automation is supposed to make testing easier and faster, not the other way around.
So please, take my advice. Use bitmap/region/image comparisons (or whatever your automated tool calls them) only to compare actual images. If you want to verify the contents of a field, read the actual value from that field. If you want to verify a button is enabled, check its actual Enabled property. This may require a bit more effort up front, but the payoff in the long run is well worth it. You'll have more efficient, more stable tests that run clean, and it won't matter if that one rogue pixel off to the left is a slightly darker or slightly lighter shade of gray than the ones around it.
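To make the contrast concrete, here's a hypothetical sketch; the element API is a stand-in for whatever your automation tool provides:

```python
# Sketch contrasting pixel comparison with property-based checks.
# The element API and control names are hypothetical.

def verify_order_screen(app):
    # Fragile: fails if any single pixel differs (OS theme, font smoothing...)
    # assert app.capture_screen() == load_baseline_bitmap("order_screen.bmp")

    # Robust: check the actual values and properties behind the pixels.
    assert app.find_element("txtTotal").value() == "$42.00"
    assert app.find_element("cmdSubmit").is_enabled()

    # Reserve image comparison for actual images, like the company logo.
    assert app.find_element("imgLogo").matches_baseline("logo.png")
```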
Wednesday, April 29, 2009
Befuddled
I've posted previously about being honest when you deal with prospective customers. When a customer describes a scenario that I know my company's product isn't a good fit for, I say so. The last thing I want is for people to think I've led them on so we can make a sale. People won't want to do repeat business with us, and it hurts both my reputation and the company's.
But I've encountered one particular scenario that I just can't wrap my head around. I was speaking with a test manager and his team. They outlined their scenario, and what they'd done with my product so far, and the issues they had encountered. I listened carefully, and based on what they wanted to achieve, I knew my app wasn't a good fit. I told them this, and stated my reasons why. Several of the team members voiced their agreement. And then the manager said something that flat out stunned me.
"I think we can make it work. We'll buy it."
I couldn't think of anything to say other than restate my previous concerns. The manager told me that his team would deal with it, and that they would purchase.
How would you handle this?
Thursday, April 23, 2009
I Can Haz Demo?
It's funny how social media has changed our perception of language. We communicate much more informally via the emoticons and acronyms that we use in IM programs, Twitter, and in-game chat. However, when communicating with first-time customers, clients, and prospects, this abbreviated means of communication strikes me as unacceptable. You're trying to portray yourself as professional, not show how hip you are. Additionally, if you are a customer requesting help, a demonstration, or license information, your sales rep probably won't take you very seriously if you send in a message like the one I received last summer:
"Can u plz help in it?bcoz it is not clear from Demo how to test Web application. Waiting 4 ur reply."
While this person may have been genuinely interested in my company's product, this communication came across as unprofessional. My sales team sent off a canned email response along with links to some recorded demos. If this person had listed their questions in plain English, we definitely would've put a lot more effort into helping them.
I don't intend to come across as snobbish in this post. I use IM-speak a lot myself, and I abbreviate words when I'm on Twitter. But when I'm trying to solve a customer's problem, I'm not out to show what a l33t hax0r I am. And in the same vein, when someone approaches me with a problem to solve, I definitely prefer it to come in clear and concise language.
Am I being too old fashioned here? What do you folks think?
"Can u plz help in it?bcoz it is not clear from Demo how to test Web application. Waiting 4 ur reply."
While this person may have been genuinely interested in my company's product, this communication came across as unprofessional. My sales team sent off a canned email response along with links to some recorded demos. If this person had listed their questions in plain English, we definitely would've put a lot more effort into helping them.
I don't intend to come across as snobbish in this post. I use IM-speak a lot myself, and I abbreviate words when I'm on Twitter. But when I'm trying to solve a customer's problem, I'm not out to show what a l33t hax0r I am. And in the same vein, when someone approaches me with a problem to solve, I definitely prefer it to come in clear and concise language.
Am I being too old fashioned here? What do you folks think?
Honesty Is the Best Policy
My dad used to have a plaque that read "If you can't dazzle 'em with brilliance, baffle 'em with bullshit." I saw this employed quite frequently when I was evaluating test tools.
When I would set up demos from tool vendors, I was always up front about my needs and what I was looking for. Then I'd watch whatever they prepared and ask any questions that sprang to mind. Sometimes those questions would stump the presenter, and he or she would come up with some answer that was chock-full of buzzwords that didn't mean anything. I never felt the need to call people on this, because in doing so they answered a different question: how honest they are. I tend to avoid doing business with people who BS me, because I never know if I can trust what they're saying. It's hard to get work done when a trust issue is always nagging in the back of your head.
I have no problem with someone telling me "I don't know", so long as that is immediately followed by "But I'll find out and get back to you." I appreciate the honesty and I respect the person more as a result of it, and I think many people share that sentiment. I would much rather work with someone who didn't know an answer and found out for me, rather than someone who'd invented something that sounded good at the time, in order to turn a quick sale.
Now that I'm the one giving the demos, I always try to prepare well in advance so I can address any questions my client may have. And when they stump me, I don't follow the advice on dad's plaque.
Monday, April 20, 2009
Screenshots Need Accompanying Info
It's said that a picture is worth a thousand words, and quite frequently, testers attach screenshots to their bug reports to illustrate a particular issue. While this can provide additional information that's helpful to a developer, I've noticed a lot of people lately who just log screenshots as bugs, with no steps to reproduce. For example, I recently received an email saying "The tool doesn't work - see attached." The attached bitmap showed the application with a blank screen; there were no error messages, no data, nothing that told me what didn't work, and nothing that told me how it had gotten into this state. The back-and-forth email chain that ensued could've been avoided completely if the person had given me the steps they were performing and what they were trying to achieve.
To encourage people to provide more information in their bug reports, try this at your next team meeting. Say "My car wouldn't start this morning. Why?" Before anyone can answer, quickly add "Oh, and here's a screenshot of it" and have a photo of your car. Hopefully this drives the point home.
Friday, April 10, 2009
You Have Control
How many times have you walked out of a meeting, grinding your teeth because someone said something that upset you? I've seen this scenario happen a lot over the course of my career, and it usually ends up with the upset person venting to a friend or co-worker who had nothing to do with the situation. Nothing ever changes between the upset-ee and the upset-er, because they don't communicate with each other.
I experienced this earlier this week. I left a meeting and was steaming mad at one particular participant. While this person had valuable things to say, they were rude and condescending, and their behavior really insulted me. I ranted to my family for a while, but it didn't help. So I sat down and wrote a very polite email to this person. I explained that I valued their input, and that I wanted to work with them, but that they had been condescending.
I didn't demand an apology, I didn't use any harsh language. I sincerely thanked them for their comments (which I really did appreciate), and stated my issue with their behavior. Once I clicked the Send button for that email, it felt like a huge weight lifted off me, and my mood improved dramatically.
So remember, you can't control other people, but you can control yourself. I tried to handle the situation with diplomacy, and while the jury's still out on what the outcome will be, I felt better because I actively did something. So next time someone gets you riled up during a meeting, step back and think about what you can do to take a measure of control in the situation.
Thursday, April 9, 2009
Free Presentation Helper Tools
When I'm giving presentations & demos, there are a handful of tools that make things go much smoother. The best part is that all these tools are free.
ZoomIt - this is a magnification and annotation tool. I use it to zoom in on areas of interest within my application, and to draw callout arrows on various bits of the app I'm demonstrating.
Keyboard Jedi - this app displays any keyboard shortcuts you've pressed. For example, if you press Ctrl+Space in Visual Studio to open a code completion window, Keyboard Jedi displays Ctrl+Space. This keeps you from needing to repeat a bunch of keyboard commands to your audience as you're going through your presentation.
CoolTimer - This is a simple timer application. I set it up as a countdown timer to show when a presentation will begin. This is nice because my audience knows exactly when things will be starting.
QRes - This is a command line application that changes your screen resolution. I have a batch file set up to change my resolution from my native 1680x1050 to something easier to read (1280x1024). I just run the batch file before my presentation and I don't need to worry about monkeying around with display settings. A quick sketch of the idea follows below.
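For illustration, here's the same idea sketched in Python rather than a batch file. It assumes QRes.exe is on your PATH, and the /x: and /y: switch syntax is an assumption on my part - check your QRes version's usage output before relying on it.

import subprocess

# Switch to a presentation-friendly resolution before a demo.
# Assumes QRes.exe is on the PATH and accepts /x: and /y: switches;
# verify the switch syntax against your QRes version.
def set_resolution(width: int, height: int) -> None:
    subprocess.run(["qres.exe", f"/x:{width}", f"/y:{height}"], check=True)

if __name__ == "__main__":
    set_resolution(1280, 1024)  # easier for the audience to read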
Wednesday, April 8, 2009
Going Virtual
Virtual machines are a great way to try out new apps without fear of completely hosing your system. I frequently use them both when I do testing, and when I give demonstrations. I use both VMWare and VirtualPC, depending on the scenario I'm working on.
One thing I've found is that Microsoft has several prebuilt virtual machines that show off some of their applications. The Visual Studio 2010 CTP is available as a VM, and there's a complete working TFS system as well.
For demonstration purposes, these are perfect because they're already set up and configured properly. That means you can spend your time on developing a worthwhile presentation for your client, rather than monkeying around with setting a system up.
So before you build a VM, run a quick search on Google to see if there's a pre-built one out there. It can save you lots of time and effort.
Friday, April 3, 2009
Dead Air
When you give lots of presentations, sooner or later something bad will happen while you're in the middle of a demo. Either your machine will crash, or the app you're demo-ing will freeze, or one of any number of things. In times like this, you don't want to be standing in front of your audience just looking goofy. That's why it's always good to have a handful of canned conversation pieces at the ready.
I typically will use the opportunity to gather more information from my audience about what they've done in the past for test automation, or what their biggest challenges are. (Normally you should learn these before you give a presentation, but sometimes that doesn't work out. If you have learned what their challenges are, this gives the audience a great chance to further elaborate on their situation.)
Alternatively, you can use the time to mention certain "side notes" that may be relevant to the audience. These include things like user conferences, training resources, or books & blogs that may help them.
Some folks like to fill this time with jokes, or anecdotes about their kids. Do this with caution. Gauge your audience and decide if that sort of filler is appropriate. I've found that the more I can keep the discussion focused on the audience and their needs the better.
What sort of things have you done to help combat dead air? Drop me a note in the comments.
Friday, March 20, 2009
Keep Your Automation Visible
You and your automation are only useful if people know what you're doing. It's always a good idea to make sure the tools and scripts you've created have as much visibility as any other testing activity going on. Otherwise, someday someone's going to say "Hey, what does [insertYourNameHere] do all day, anyway?"
Here are a few thoughts on how you can keep people from asking that question:
Demonstrate your tools. Show the test & dev teams what you're working on. Do it during a lunch & learn and let them see what you've done. People may suggest some ideas on how to improve what you've created. Others may realize they have a use for what you've created. Either way, you're keeping your team informed on what you've built and how it can benefit them.
Send out updates. Between demonstrations, jot off a quick note to your team when you've updated a tool or script, explaining how the change benefits them. When they realize you've added the ability to test foreign languages, or you've made it easier to verify values have been written to the registry, they'll be excited and will put those enhancements to work right away.
Blow your own horn. I hate doing this, but sometimes it's necessary. Your management needs to know that what you're doing is worthwhile, so sometimes you have to send off a mail saying "Hey, that tool I wrote helped Carla find 10 bugs in an hour" or "The smoke test I wrote caught a bunch of problems before it went to the test team, which kept them from wasting their time on a broken build." Especially in these economic times, being invisible is not the way to go.
Solicit feedback. Communication is a two-way street. Always ask your test team, "How could this be better?" When they see you've implemented the requests they asked for, they'll be more invested in your tools, and more excited about using them.
What other methods have you used to increase automation's visibility?
Monday, March 16, 2009
Your Automated Tests Are Stupid
Yes, they are. And not because you didn't put a lot of effort into planning them, nor because you implemented them poorly. Your tests are stupid because the computer is stupid. It's going to check exactly what you told it to, and only what you told it to. Keep that in mind: if you write a test that verifies a field in your app correctly displays the sum of two numbers, then your test will make sure that the sum of 2+2 = 4. That's great, but the test will miss a lot of things that a manual tester would catch - for example, if the results are suddenly displayed in bright pink, if they're in a font size too small to see, or if they're displayed offscreen. Your automated tests will also miss any spelling mistakes in fields adjacent to the field being tested, which a manual tester would (hopefully) notice.
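To see just how literal that check is, here's a minimal sketch; the add() function and the test are hypothetical stand-ins, not code from any real app.

import unittest

# Hypothetical function standing in for the app's sum-field logic.
def add(a, b):
    return a + b

class SumFieldTest(unittest.TestCase):
    def test_sum_is_correct(self):
        # This passes as long as the value is 4 -- even if the app renders
        # the result in bright pink, a microscopic font, or offscreen.
        self.assertEqual(add(2, 2), 4)

if __name__ == "__main__":
    unittest.main()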
The point I'm trying to make here is that automated tests will not and should not replace your manual testers. Automation should be used to augment your manual testers; in the case above, you could have an automated test verify hundreds of thousands of various input combinations, and then have a manual tester run through once to do a spot check. That way you get the best of both worlds - you're assured that the app is functioning properly from both a technical and a human perspective.
Wednesday, March 11, 2009
Reflecting on a Demo
Last Friday I gave a TestComplete webinar to 314 people. The topic was Creating Script Extensions, which lets people extend the standard capabilities of TestComplete. I think it went really well overall, and I wanted to share a few of the things I learned/found helpful.
Let the audience know where you are in your presentation. I had a really short slide deck; most of the slides were used to track where I was in my agenda. This served as a transition between topics, and helped people keep track of where we were in the overall presentation.
Presenter Mode is a wonderful thing. PowerPoint 2007 has this great feature called Presenter Mode. This lets you have your presentation shown on one monitor, and your slides along with speaker notes and a timer on another. I was able to refer to my notes for each slide without having to print them off. This helped keep the presentation smooth, since I didn't have to go back and forth between a printout and what was displayed on screen.
Always have a cheat sheet. A good chunk of the demonstration involved teaching people how to write the code needed for a script extension. I had a blank Notepad++ window on the monitor I was presenting on, and on my other monitor I had the complete code I was writing, along with comments to help explain the code. This was insurance against me having a complete brain cramp while I was coding.
Load your applications ahead of time. No one wants to watch splash screens load or watch someone stumble through explorer windows and shortcut clicks. Have everything loaded in advance, and the demo goes smoother.
Order your applications in the taskbar. I used several programs during the course of the demonstration; PowerPoint, TestComplete, Notepad++, Paint.NET, Windows Explorer, and a couple of Notepad windows for my cheatsheets. It helps keep things smooth when the items in the Taskbar are ordered according to the order in which they'll be used. That way, you aren't searching for anything. There's a great little utility called TaskBar Shuffle that lets you reorder items in the Taskbar, in case you don't open apps in the order in which they'll be needed.
Poll your audience often. I was giving this demo remotely, so I couldn't see my audience. That meant I had no idea if they were engaged or not. I made it a point to ask them questions as I was presenting. Most of these were simple yes or no questions, but it was enough to keep them involved.
Have a wingman (or wingwoman). I had a co-worker riding shotgun with me during the demo. It was his job to answer questions about the telephone number needed to call in, and to alert me to any audio or visual problems that people were experiencing. This freed me up from having to respond to certain questions during the demonstration, and provided a mechanism to head trouble off before it became serious.
Breathe. I was really excited about giving this session. My energy level was pretty high, and I felt a little out of control in the first few minutes. I made myself slow down and take a breath, and was able to stay on a good pace for the session without rushing through things.
Record yourself. My wingman also recorded the demonstration. We wanted to be able to send this demo to clients who couldn't make it to the live session, but it's also a great tool for self-improvement. For example, I thought I had a really good handle on not saying "um" while presenting, but in listening to the recording I hear myself saying it, especially when I'm answering questions toward the end of the session. I'm going to keep working on that.
If you'd like to take a look at the recording, you can see it here (it's about 90 mins long). I'd welcome any feedback you can give.
Friday, March 6, 2009
You Have to Start Somewhere
I was speaking with a client recently who asked a lot of great questions. He was the manager of a small QA team, and really had a good feel for automation. I addressed everything he asked about, and then he told me that he had one other member of his team who had just a few "quick questions."
After speaking with this person for a few minutes, it was blatantly obvious that not only did he have no clue about test automation, he didn't have a clue about testing in general. To top it off, his initial attitude toward me was rather rude. My first reaction was one of mixed shock and frustration, but then I did what William Ury suggests in his negotiation book "Getting Past No" - I went to the balcony.
Basically this concept means you distance yourself from your emotions and your natural reactions - in essence, you step back from the situation and look at it with different eyes. I tried to put myself in this person's shoes. He was being asked to do something that he wasn't qualified for, and he was probably frustrated about that. Given today's economy, he was probably also scared that if he couldn't grasp these concepts, he might be out of a job.
So I stepped back and listened to his concerns and what he was trying to achieve. Then I addressed each concern in turn, and gave him lots of opportunities to ask questions or get clarification. The end result was a successful session. He admitted he had a lot of learning to do, but he felt really good about his chances to succeed.
My takeaway here is that everyone has to start somewhere. After doing test automation for so long, it's easy to forget that there was once a time when I didn't know a thing about it. I've posted a note above my desk that says "You were an English Lit Major" to remind myself that I asked all those really basic questions once too, and that perspective is part of the key to making other people successful as they learn about testing and test automation.
Wednesday, March 4, 2009
The Automation Mantra
I love tools. I love writing them, I love learning how to make the computer do more work for me. But it's easy to get caught up in the excitement of development and lose sight of what automation's really about. That's why I've adapted a mission statement that an old boss of mine had. Whenever I find myself writing a new app, or looking for a place to apply something I've created, I take a step back and say to myself -
"My job is to provide the QA team with helpful, easy to use tools. The output of these tools should be simple to understand and share."
If what I'm doing doesn't meet the above criteria, then I know I'm doing something wrong. It's a simple sanity check, but it keeps me from building things that aren't worthwhile. There are a lot of people out there who have been soured on automation because their experiences were with tools that did not follow these guidelines.
So when you're building tools for others, make sure they are:
Helpful - don't build a solution in search of a problem. Make sure you're really filling a need.
Easy to Use - don't make it harder to use the tool than to run the test without it.
Easy to Understand Output - put human-readable English in your output messages, and store those messages in a file format that's easy to read. I've found XML/HTML are usually the best bets; PDF isn't bad either. (A quick sketch of this follows below.)
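As a sketch of that last guideline (the file name and results here are made up), a few lines of Python can turn raw pass/fail data into a report anyone on the team can open in a browser.

# Hypothetical results; in a real tool these would come from the test run.
results = [
    ("Login with valid credentials", "PASS"),
    ("Login with expired password", "FAIL - no error message displayed"),
]

# Write a plain-English HTML report that's simple to understand and share.
with open("test_results.html", "w") as report:
    report.write("<html><body><h1>Test Results</h1><ul>\n")
    for name, outcome in results:
        report.write(f"<li>{name}: {outcome}</li>\n")
    report.write("</ul></body></html>\n")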
What other guidelines do you follow with your automation mantra?
"My job is to provide the QA team with helpful, easy to use tools. The output of these tools should be simple to understand and share."
If what I'm doing doesn't meet the above criteria, then I know I'm doing something wrong. It's a simple sanity check, but it keeps me from building things that aren't worthwhile. There are a lot of people out there who have been soured on automation because their experiences were with tools that did not follow these guidelines.
So when you're building tools for others, make sure they are:
Helpful - don't build a solution in search of a problem. Make sure you're really filling a need.
Easy to Use - don't make it harder to use the tool than to run the test without it.
Easy to Understand Output - put human readable English in your output messages, and store those messages in a file format that's easy to read. I've found XML/HTML are usually the best bets; PDF isn't bad either.
What other guidelines do you follow with your automation mantra?
Tuesday, March 3, 2009
Be an Educator, not an Enabler
This weekend I set up a computer at a relative's house. I unboxed it, plugged it in, and went through the Windows setup screens. I set the time zone, removed some of the preinstalled crap-ware, and finally installed MS Office. The entire process only took about 35 minutes, but when I went home that night, I was disappointed in myself; I had missed out on an opportunity to teach my family how to do something with a computer.
Let's be real here, folks: how hard is it to set up a new PC? This one had 3 wires: the PC power cord, the monitor power cord, and the cable that connected the monitor to the PC. The keyboard and mouse were wireless, and just needed batteries installed. Loading Office is just a matter of inserting a DVD and clicking "Next" a few times. I could have walked them through the process, and they'd have felt smarter and more confident with the technology. Instead, I've inadvertently fostered the belief that working on computers is hard and borders on black magic.
It's like that with test automation, too. People who don't have an understanding of what it does and how it works see it as some sort of dark art. It's up to us to educate them so that they learn the right way to do things, rather than us always doing it for them. Good teachers help people figure out the answers for themselves, rather than give the answers to them. I'm going to try and be more like that in the future.
What have you done to help educate your team/group on automation? Have you been an educator or an enabler?
Thursday, February 26, 2009
Obfuscation Question
Many companies employ obfuscation to keep their code from being reverse engineered. But this leads to an interesting problem: a GUI that's been obfuscated often can't be automated, because GUI test tools typically identify objects by the internal class and member names that the obfuscator has scrambled. The only workaround I've been able to come up with for GUI automators is to write and run tests against non-obfuscated code. The problem arises when a manager says, "But then we aren't testing the same thing that we ship to our customers."
This got me thinking about how obfuscation impacts other kinds of testing. I would think that obfuscation would impact any tests that were run after the obfuscation process. So is it safe to say that companies who employ unit tests must be running those tests prior to the obfuscator running? If so, should the manager be concerned that the unit tests were run against non-obfuscated code?
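In pipeline terms, that ordering might look something like the sketch below; every script name here is hypothetical, since build setups vary widely.

import subprocess

# A hedged sketch of the build ordering discussed above: automated tests
# run against the non-obfuscated build, then the obfuscator runs, and
# only sanity checks target the bits that actually ship. All of these
# command names are made up for illustration.
steps = [
    "build.cmd",                  # compile the application
    "run_unit_tests.cmd",         # unit tests on non-obfuscated assemblies
    "run_gui_automation.cmd",     # GUI automation on the non-obfuscated build
    "obfuscate.cmd",              # obfuscate the release binaries
    "smoke_test_obfuscated.cmd",  # sanity checks against the shipped bits
]
for step in steps:
    subprocess.run(step, shell=True, check=True)  # stop on first failure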
Obfuscation would also prevent any other type of post-build white-box testing; any tools I've written that call methods directly from DLLs or invoke web service calls would be stopped dead.
So what's the best solution here? Is it to do the bulk of your automated testing against non-obfuscated code and then do manual sanity checks against an obfuscated build? Or is there a better way that I'm not seeing? Please let me know in the comments.
Wednesday, February 25, 2009
Pointers on Pointing
Gestures are a powerful part of any presentation. However, you have to be careful when you point at objects or people to ensure your body language doesn't send the wrong message. Here are a few things I've learned when pointing during presentations:
Don't use a laser. I love giving presentations; I get really excited when I'm in front of people. But that excited energy tends to make my hands shake a little. It's enough that I can't quite focus a laser dot on a single point without it bouncing a bit. If you're really nervous about being in front of people, the dot may dance all over the screen. That's distracting to your audience, and it takes away from the message you're trying to deliver. (Plus we've all seen those comics where the guy with the laser pointer accidentally beheads someone in his audience...)
Point at slides, point at objects, but don't point at people. We're all taught in school that it's not polite to point. Whether you intend it or not, pointing at people implies some measure of accusation. You want to invite your audience into your message, not make them feel like they're on trial. So instead of pointing at someone, use an open hand with palm up and fingers out. This is a welcoming and encouraging gesture that brings people in rather than putting their backs up.
Use the mouse sparingly. If your mouse is bouncing around your screen throughout the presentation, people are going to get dizzy. Peter Cohen calls this "Pointer Palsy", and it's extremely distracting. Perform your actions slowly, so the mouse isn't racing around the screen. If you are giving a remote demo, you can gesture with the mouse, but use these gestures sparingly. If you don't need to gesture at something, take your hand off the mouse. You'd be surprised how many of us move the mouse around without realizing it or intending to.
Are there other "pointers" you've learned? If so, please share them in the comments.
Monday, February 23, 2009
Notes on Demos
I give lots of demos every day, and I'm always on the lookout for things to make them better/smoother/cleaner. So the next time you have to give a technical presentation, here are some (hopefully) helpful tips:
Turn off all unnecessary applications. You don't want that IM from your significant other coming in during the middle of a discussion. Turn off your email, chat, twitter and any other programs that aren't relevant to your demonstration.
Remove any potentially offensive/confidential material from your recent items. The pics your brother-in-law sent of his latest Vegas exploit may be cool/funny, but they're not professional. On a related note, be professional when naming your computer. During a presentation I sat in on last week, the presenter's laptop was named WideAss. And since the name of his machine was prominently displayed in several places of the app being demonstrated, it's not like you could miss it. That's just not professional, and I couldn't ignore it.
Make it readable. If you're using slides, keep the font big. If you have to shrink the font to get everything to fit, edit, delete, or split it into multiple slides. If you're demo-ing code, keep that big, too. Scott Hanselman suggests using Lucida Console at a 14-18 pt font size for all code demonstrations, and I've found that works very well. You lose your rhythm as a presenter very quickly when you have to keep asking if everyone can read what's on your slides.
Finally, and most importantly: practice! Spend some time saying exactly what you're going to say. Don't go halfway on this one; sitting in front of your slides saying "ok, now I'll talk about X" will not make an effective presentation. Say exactly what you're going to say, exactly how you want to say it. If that means you need to retreat to a conference room or hide in your car at lunch break, then do it. The more familiar you are with what you're going to say, the more confident you'll come across.
I'll post more demo tips as I can, but what else have you done to make your presentations more effective?
Friday, February 20, 2009
Portable Tools
There are times when your test machines may not have all the tools on them that you need to perform a test effectively. Sometimes you're not allowed to install other apps on a system due to a security concern. Luckily, many applications can be run from a flash drive. This lets you get your job done while not contaminating the system. Here's a quick list of some of my most frequently used portable apps. You can download a full suite of programs from PortableApps.com, and many more from PortableFreeware.com
Notepad++ - lightweight editor for pretty much any type of script.
Recuva - lets you quickly recover deleted files.
Wireshark - packet monitoring tool useful when load/performance testing.
PortablePython - a full Python development environment on your flash drive.
FoxitPDF Reader - because sometimes, Acrobat just isn't there.
QueryExpress.exe - a simple SQL Query Analyzer look-alike.
WinMerge Portable - lets you quickly perform a diff between two files.
What other portable tools do you use?