The Agile Architect
A Visit to the Automated Testing Zoo
Our Agile Architect wants to talk about lessons learned from his experiences with test automation. As usual, he does it in his own unique way.
- By Mark J. Balbes, Ph.D.
- August 15, 2022
As a software developer, I was always keenly aware of how important automated tests are to the quality of my code. But it wasn’t until I was a brand-new, wet-behind-the-ears VP that I understood their higher value. It was early in my tenure when my QA director came into my office.
“We're ready to push to production,” she said. “We just need your final approval.”
“Are we sure it’s going to work?” I asked.
“Sure,” she said. “We tested it.”
I'd been in the job long enough to know that by “We tested it,” she meant that a couple of QA people had done some unscripted, ad-hoc manual testing on the app. I was being asked to decide, with no objective evidence, that the system worked. And even if I believed it had been thoroughly tested, it had still been tested only once in an uncontrolled QA environment.
I approved the release because, well, what else could I do? But I was scared.
Test Me at the Zoo
While I was thinking about this issue, this story popped into my head. (Apologies in advance.)
Dan: Hello, folks. This is Dan Dontu Dare coming to you live from the Metropolitan Test Zoo, known for the diversity of the creatures it exhibits, and the live, realistic environments it uses to contain them. Today, the zoo is opening its new premier exhibit, which has brought together a wide variety of Tests from all over the world. Looking around, I can see Functional Tests and Non-functional Tests alike.
I can see a big crowd over at Functional Test Land. Let’s stop in and see how the Click-and-Record Tests are doing. There certainly seem to be a lot of them! They're not very sophisticated beasts, but they are fun to play with. Just a little instruction and even children can handle them safely. Say, those little tykes sure like to push those buttons.
Now let’s head over to the Automation-As-Code Tests exhibit. I don’t know who names these beasts, but that one is a mouthful. You'll notice there’s quite a bit of distance between us and them. Only the trainers, highly specialized with years of experience, are allowed near these Tests. I’m told if they aren’t cared for properly, they can grow sickly. "Brittle" was the term I believe one of the trainers used. Once that happens, it can take tremendous effort to help them recover. And unfortunately, sometimes the Test just can’t be saved. It’s a shame and a great loss to the vitality and diversity of the Test Suite when that happens.
That’s right, folks. A pack of tests is called a "suite." We learn something new every day!
Now I’m going to hand you off to my colleague, Patricia Patty (we call her "Pat"), who's standing over by the Regression Test Train. Over to you, Pat!
Pat: Thanks, Dan. We were over here earlier today and there was a huge line, but it doesn’t look like many people are here now. I see someone in line walking away. Ma’am! Patricia Patty, Mid-Day News. Can you tell our audience who you are and why you left the line for the Regression Test Train?
Ruby: Hi Miss Patty. I’m Ruby. Big fan. And to answer your question, I just got tired of waiting. That Regression Test Train is so slow, it takes forever. We were promised a fun ride, a fantastic opportunity to see all the different Tests in action, shooting through the CI/CD pipeline at breakneck speed, and then finally a dizzying drop into Production Lake. Very disappointing.
Pat: Where are you headed now?
Ruby: I think I’ll go to the AI/ML exhibit. No one knows what it is, so it must be powerful stuff.
Dan: Pat, we're going to break away from you now. People up ahead of us have started running in all directions. I can't see what they're running from… Wait! I see it now! It looks like a Load Test has broken loose! These are large beasts, and this one looks angry. I don’t know how it escaped its environment, but whoever set it up obviously didn’t lock it down very well. There looks to be a lot of collateral damage. It’s going to take a lot of effort to clean up this mess. Someone really screwed up! Oh god…
Pat: Dan, while you're running for your life, allow me to show our audience the Test Recovery Center. This is where the zoo treats failing tests. Over here, we see an Integration Test. Poor guy. They don’t know what’s wrong with him. He seems fine one minute, and he's failing the next. They can’t figure out why. The staff has pretty much given up. At some point, they may have to… well… you know. It just takes too much time and effort to chase after his intermittent problematic behavior.
Dan: Hi folks! I’m back. Boy, that was some pretty scary stuff. That Load Test got loose and made its way all the way to Production Lake. It started going after all the folks enjoying Production. Pretty much chased everyone away. It’ll be a while before they come back.
Meanwhile, I’ve managed to make my way to Performance Test Park. Just as at Load Test Loft, there’s an extra admission fee and required training before entering. I can already see that there are many fewer people here due to the higher barrier to entry. I saw some folks trying to sneak in. I don’t know what happened to them while they were in there, but they looked pretty unhappy on the way out.
Over to you, Pat!
Pat: Thanks, Dan! It just wouldn't be a trip to the Met without a stop at the Whip-the-Test! Ride. Unfortunately, they seem to be having some problems. The developers of the ride wanted to incorporate real Tests into it. But apparently, they used so many Unit and Integration Tests that the ride just won’t move. It’s become intractable to any change or reconfiguration. The ride's developers were going to try to remove some of the Tests, but they decided it would be quicker to rebuild the whole thing.
Dan: Wow, Pat! That’s a lot of money to sink into that thing just to throw it all away.
Well, that’s it for today, folks. We're out of time. I hope you enjoyed your visit with us to the Automated Testing Zoo. Join us next week when we explore the Isle of Indivisible DevOps!
For those who are totally confused, but still reading (and why are you doing that?), here are the points I'm trying to make:
- Click-and-record tests can provide value and have a low barrier to entry for less technical people.
- Automation-As-Code tests need more technical people. Writing automated tests is itself a software development activity.
- If not written well, tests can become brittle, breaking when code changes or even from environment or timing differences. When a test fails intermittently, the build fails intermittently, and then people start ignoring all test failures. It's best to pull such tests from the suite until they can be stabilized.
- Make your tests run fast. Minutes, not hours. Run tests in parallel if you can’t speed up individual tests enough. Slow tests don’t get run by developers on their development machines and don’t provide quick enough feedback from the build machine.
- Isolate the test environments from production. Let me say this once again. Isolate the test environments from production. Completely!
- Kill intermittently failing tests. They create too much uncertainty about the validity of test results in general.
- Performance testing, load testing, and other kinds of non-functional testing are just as important as functional testing, but they require a higher level of technical expertise.
- Poorly written unit and integration tests can be a straitjacket on your code, making it difficult, if not impossible, to refactor to a better design.
- There’s a big debate about whether state-based testing or interaction-based testing is better. It’s state-based. Don’t test how the code works. Test that it is giving you the right results. If you test how the code works, you can’t change how the code works.
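The state-based vs. interaction-based distinction is easier to see in code. Here's a minimal sketch using Python's built-in unittest.mock; the Cart class and its pricer collaborator are hypothetical, invented for illustration.

```python
from unittest.mock import Mock

class Cart:
    """A toy shopping cart that delegates price lookups to a collaborator."""
    def __init__(self, pricer):
        self.pricer = pricer
        self.items = []

    def add(self, sku, qty):
        self.items.append((sku, qty))

    def total(self):
        # Implementation detail: one price lookup per line item.
        return sum(self.pricer.price(sku) * qty for sku, qty in self.items)

def test_total_state_based():
    # State-based: assert on the result. Survives any refactoring of total().
    pricer = Mock()
    pricer.price.return_value = 5
    cart = Cart(pricer)
    cart.add("apple", 3)
    assert cart.total() == 15

def test_total_interaction_based():
    # Interaction-based: assert on *how* total() works. Breaks if we cache
    # prices, batch lookups, or otherwise change the internals, even though
    # the answer is still correct.
    pricer = Mock()
    pricer.price.return_value = 5
    cart = Cart(pricer)
    cart.add("apple", 3)
    cart.total()
    pricer.price.assert_called_once_with("apple")
```

Both tests pass today, but only the state-based one leaves you free to change how `total()` computes its answer.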
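As for intermittent failures, one cheap way to smoke out a flaky test before it poisons the build is to run it many times in a row. This is a sketch under stated assumptions; the function names are hypothetical, not from any particular framework.

```python
import random

def run_repeatedly(test_fn, runs=100):
    """Run a test many times; return how many runs failed.

    A deterministic test should fail zero times or every time.
    Anything in between is flaky and should be quarantined.
    """
    failures = 0
    for _ in range(runs):
        try:
            test_fn()
        except AssertionError:
            failures += 1
    return failures

def stable_test():
    assert 2 + 2 == 4

def flaky_test():
    # Simulates a test that fails ~10% of the time. Real-world culprits:
    # shared state between tests, sleeps instead of synchronization,
    # network calls, reliance on wall-clock time or test ordering.
    assert random.random() > 0.1
```

A test with no external dependencies that fails even once in a hundred repeated runs is telling you something; pull it from the suite until the nondeterminism is fixed.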
About the Author
Dr. Mark Balbes is Chief Technology Officer at Docuverus. He received his Ph.D. in Nuclear Physics from Duke University in 1992, then continued his research in nuclear astrophysics at Ohio State University. Dr. Balbes has worked in the industrial sector since 1995, applying his scientific expertise to the discipline of software development. He has led teams ranging in size from a few software developers to a multi-national Engineering department with development centers in the U.S., Canada, and India. Whether serving as product manager, chief scientist, or chief architect, he provides both technical and thought leadership around Agile development, Agile architecture, and Agile project management principles.