
Monday, December 29, 2008

William Wake described the essence of an automated test as Arrange, Act, Assert. I've added "Erase" to account for the clean-up that some tests have to do. You might ask why I added "Erase" to "Arrange, Act, Assert" rather than some word starting with "A". I think starting with "eh?" is close enough. :-)

"The thing about elves is they've got no ... begins with m," Granny snapped her fingers irritably."

"Manners?"

"Hah! Right, but no."

"Muscle? Mucus? Mystery?"

"No. No. No. Means like ... seein' the other person's point of view."

Verence tried to see the world from a Granny Weatherwax perspective, and suspicion dawned.

"Empathy?"

"Right. None at all. Even a hunter, a good hunter, can feel for the quarry. That's what makes 'em a good hunter. Elves aren't like that. They're cruel for fun [...]"

—Terry Pratchett, Lords and Ladies


In C++, with certain test frameworks, a test might be specified in a manner something like the following.



TEST(TestBlurImageFilter)
{
    // Arrange
    string outfileName = NewTempFileName("TestImageFilter");
    Image* sourceImage = new Image("lena.png");

    // Act
    ImageFilter* filter = new BlurImageFilter();
    filter->ProcessToFile(sourceImage, outfileName);

    // Assert
    AssertImagesEqual("expected_lena_blurred.png", outfileName);

    // Erase
    DeleteTempFile(outfileName);
    delete filter;
    delete sourceImage;
}


Note: generally you don't want to deal with files in unit tests; working in-memory would be much faster. Also, if this is one of those frameworks that throws an exception, or otherwise aborts the test when an assertion fails, then the "Erase" portion of the test won't be executed if AssertImagesEqual fails. Let's assume that's not a problem for the moment.
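
(If it ever is a problem, and your framework has no fixture support, here's roughly what the workaround looks like. This is only a sketch, reusing the hypothetical helpers from the test above and assuming the framework signals assertion failure by throwing an exception:)

TEST(TestBlurImageFilterWithManualCleanup)
{
    // Arrange
    string outfileName = NewTempFileName("TestImageFilter");
    Image* sourceImage = new Image("lena.png");
    ImageFilter* filter = new BlurImageFilter();

    try
    {
        // Act
        filter->ProcessToFile(sourceImage, outfileName);

        // Assert (may throw)
        AssertImagesEqual("expected_lena_blurred.png", outfileName);
    }
    catch (...)
    {
        // Erase on failure, then let the framework see the exception.
        DeleteTempFile(outfileName);
        delete filter;
        delete sourceImage;
        throw;
    }

    // Erase on success -- note the duplicated clean-up code.
    DeleteTempFile(outfileName);
    delete filter;
    delete sourceImage;
}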

Let's imagine that you then write another test like so:



TEST(TestUnblurImageFilter)
{
    // Arrange
    string outfileName = NewTempFileName("TestImageFilter");
    Image* sourceImage = new Image("lena.png");

    // Act
    ImageFilter* filter = new UnblurImageFilter();
    filter->ProcessToFile(sourceImage, outfileName);

    // Assert
    AssertImagesEqual("expected_lena_unblurred.png", outfileName);

    // Erase
    DeleteTempFile(outfileName);
    delete filter;
    delete sourceImage;
}


Now you've got duplicated "Arrange" and "Erase" sections, and duplicated logic in tests can be just as bad as it would be in production code. Fortunately, most test frameworks already have support for extracting "Arrange" and "Erase" into methods on a "test fixture". The above code could be refactored to something like the following:


class ImageFilterTests : public TestFixture
{
public:
    ImageFilterTests()
        : sourceImage(NULL), filter(NULL)
    {
    }

    string outfileName;
    Image* sourceImage;
    ImageFilter* filter;

    virtual void SetUp()
    {
        // Arrange
        outfileName = NewTempFileName("TestImageFilter");
        sourceImage = new Image("lena.png");
    }

    virtual void TearDown()
    {
        // Erase
        DeleteTempFile(outfileName);
        delete filter;
        delete sourceImage;
    }
};

TEST_F(ImageFilterTests, TestBlurImageFilter)
{
    // Act
    filter = new BlurImageFilter();
    filter->ProcessToFile(sourceImage, outfileName);

    // Assert
    AssertImagesEqual("expected_lena_blurred.png", outfileName);
}

TEST_F(ImageFilterTests, TestUnblurImageFilter)
{
    // Act
    filter = new UnblurImageFilter();
    filter->ProcessToFile(sourceImage, outfileName);

    // Assert
    AssertImagesEqual("expected_lena_unblurred.png", outfileName);
}


Not only does this eliminate the duplicated logic; most unit test frameworks also guarantee that the TearDown method runs even if the test fails, so you don't have to write your own try/catch blocks or other contortions to get an exception-safe "erase".

You'll see that I also added a constructor to ensure that the pointer members start out NULL, so we don't delete garbage pointers if the Image or ImageFilter objects were not allocated successfully. (You should consider using boost::shared_ptr and/or boost::scoped_ptr whenever you're dealing with object pointers in C++ code and tests, by the way.)

In those C++ test frameworks where the test fixture is created just before and destroyed just after each test runs, the SetUp and TearDown methods can (almost always) be replaced with a constructor and destructor. Using that, plus boost::scoped_ptr to ensure exception-safe object deletion, allows us to write the following code:



class ImageFilterTests : public TestFixture
{
public:
    ImageFilterTests()
        : outfileName(NewTempFileName("TestImageFilter")),
          sourceImage(new Image("lena.png"))
    {
        // Arrange
    }

    virtual ~ImageFilterTests()
    {
        // Erase (the scoped_ptrs delete their objects automatically)
        DeleteTempFile(outfileName);
    }

    string outfileName;
    boost::scoped_ptr<Image> sourceImage;
    boost::scoped_ptr<ImageFilter> filter;
};

TEST_F(ImageFilterTests, TestBlurImageFilter)
{
    // Act
    filter.reset(new BlurImageFilter());
    filter->ProcessToFile(sourceImage.get(), outfileName);

    // Assert
    AssertImagesEqual("expected_lena_blurred.png", outfileName);
}

TEST_F(ImageFilterTests, TestUnblurImageFilter)
{
    // Act
    filter.reset(new UnblurImageFilter());
    filter->ProcessToFile(sourceImage.get(), outfileName);

    // Assert
    AssertImagesEqual("expected_lena_unblurred.png", outfileName);
}

Monday, December 15, 2008

C++ Mocking Framework

Google released Google Test for C++ earlier this year, and just recently released the Google C++ Mocking Framework.

These are extensively documented... check them out!
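
If you haven't seen a mock framework before, here's a minimal sketch of what Google Mock usage looks like. The MOCK_METHOD1 macro, EXPECT_CALL, and ::testing::Return come from the framework; the ImageWriter interface and everything else here is made up for illustration:

#include <gmock/gmock.h>
#include <gtest/gtest.h>
#include <string>

// Hypothetical interface we want to mock.
class ImageWriter
{
public:
    virtual ~ImageWriter() {}
    virtual bool Write(const std::string& path) = 0;
};

// Google Mock generates the mock implementation from these macros.
class MockImageWriter : public ImageWriter
{
public:
    MOCK_METHOD1(Write, bool(const std::string& path));
};

TEST(MockDemo, WriterIsCalledWithExpectedPath)
{
    MockImageWriter writer;
    EXPECT_CALL(writer, Write("out.png"))
        .WillOnce(::testing::Return(true));

    // Real code under test would receive the mock via the interface;
    // here we call it directly to keep the sketch short.
    EXPECT_TRUE(writer.Write("out.png"));
}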

Tuesday, December 9, 2008

What a Tangled Web We Weave, When We Don't Have Effective Tracking

This is a re-post of a blog entry I wrote in 2003...

Steve Norrie points to a recent online article by Jerry Weinberg, published in CrossTalk, the Journal of Defense Software Engineering, here: Destroying Communication and Control in Software Development.

I'd like to mention a few of the Extreme Programming (XP) solutions to some of the problems mentioned in this article. Be aware that XP is a lightweight process, relying on people rather than technology to do the right thing. (A savvy person can undermine technology anyway.)

Requirements. One of the first ways communication can be destroyed is by not doing requirements well: not involving the customer, thinking that requirements are a waste of time, and so on. Extreme Programming recommends involving the customer, or a qualified representative of the customer, throughout the entire project. And in addition to talking about the requirements often and in detail, XP requires writing the requirements down in an executable form: automated acceptance tests, as sketched below.
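
To make that concrete, here is a tiny sketch of a requirement written in executable form, in the single-argument test-macro style used earlier on this blog. The Invoice class, its methods, and the ASSERT_TRUE macro are all hypothetical stand-ins for whatever your domain and framework provide:

// Requirement (from a story card): "A fully paid invoice is closed."
TEST(AcceptanceTest_FullyPaidInvoiceIsClosed)
{
    Invoice invoice(100 /* total due, in dollars */);

    invoice.RecordPayment(100);

    ASSERT_TRUE(invoice.IsClosed());
}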

The configuration management system (CMS). The CMS tracks requirements, design, code, test data, test results, user documentation, etc. Jerry notes that the information in the CMS can be undermined by failing to keep it up to date, restricting read access from people who should have it, removing data, or failing to put data into it. Extreme Programming doesn't require any specific CMS (software or otherwise) for this tracking, but does suggest using the simplest thing that actually works. For code, tests, and test data, I recommend using CVS or some other source-code-management system. For acceptance test results, many XP teams record those on a white board or poster board, updating them weekly or daily and charting them over the course of the project. XP does strongly recommend using index cards (story cards) for tracking requirements during initial and weekly planning, and also strongly recommends documenting the relationship between the automated acceptance tests and the story cards. The person playing the Customer role, as well as the whole team, is responsible for keeping track of the stories. Many XP teams also record the stories in web-accessible ways, such as a wiki.

As for a bug-tracking database, some XP teams track bugs the same way as stories: index cards and acceptance tests. You might think this won't work, but consider that several XP teams have reported that after adopting XP their bug rate dropped from hundreds per six months to around one bug per month. It helps that bugs are not usually recorded until after a story is finished, and in XP a story is not finished until it passes its acceptance tests; this requires conversation between the tester and coder as soon as problems are noticed during the implementation and testing of the story. When a bug is recorded, it probably indicates that an acceptance test that had been passing has started failing.

Weinberg recommends that you "set and enforce a policy of complete and open information at all times." Agile processes like XP need accurate information daily. Many projects keep tracking information on poster boards and white boards, visible not only to all team members but to anyone in management who walks by. This is the "Project Progress Poster" concept that Weinberg recommends in Quality Software Management: Anticipating Change. Since few people in management walk by in my company, we keep our tracking information in our wiki web pages.

Quality Assurance. Weinberg recommends "Prevent these abuses by having quality assurance report to the highest levels of management, and not to project management." In XP, QA testers are delegates of the person playing the Customer role -- the same person who defines the requirements. QA testers should not only be implementing and running the automated acceptance tests, but also running stress tests and doing manual testing of the product's user interface.

Weinberg reports that testing often comes too late in the project to be useful for effective risk management. (There's a phrase in XP circles: "Doctor, it hurts when I do this...") Don't wait until it is too late. XP requires testing to start in the first iteration of the project -- the first week. This and other XP practices enable effective risk management.

The XP solutions noted here require that project management be willing to face reality at all times. One of the quickest routes to failing with XP is to not do XP. If project management destroys information, hides it, degrades it, or inserts misleading information, intentionally or not, it is going to be very difficult to have a successful project, no matter what methodology is used.