Fear and Testing

I’m encouraged and excited by what I see coming from user-led conferences. I’ve had great first-hand experiences at the Java Posse Roundup (rumor is February 21-25 this coming year), No Fluff Just Stuff, and StrangeLoop (2010); and I’ve really benefitted from materials published on-line by other conferences, such as Devoxx and—a recent discovery—JRubyConf.

I was particularly impressed and inspired by Must. Try. Harder. (also linked from here and here), a presentation by Keavy McMinn, who described her experiences training for Ironman. Part of Keavy’s first blog post on Ironman Cozumel resonated strongly with something I once saw in some code, hence this article.

By way of background (with vagueness required by confidentiality considerations), I remembered looking at test code wrapped around a fairly subtle bit of business logic that had passed through multiple hands. I believed that I understood the intentions of the original designer, and wanted to confirm or correct that belief. All of which sounds like a perfect fit for test cases, right?

Hmmmmm.

The majority of the test cases dealt with various ways that something could go wrong in configuring the targeted business component. Additional tests dealt with incorrect inputs to a correctly-configured instance, and only a small fraction actually dealt with the behavior of a correctly-configured instance. In fact, there were no tests that answered the particular questions that I had in mind.
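
To make that imbalance concrete, here is a caricature in JUnit of the shape I’m describing. The class and test names are invented for illustration (not the actual code), and the bodies are elided: a pile of tests about configuration going wrong, a couple about bad input, and a single test about what the component is actually for.

```java
import org.junit.Test;

// An invented outline: the shape of the suite, not its substance.
public class PricingRuleTest {

    // Ways the configuration can go wrong...
    @Test public void rejectsNullConfiguration()     { /* ... */ }
    @Test public void rejectsEmptyConfiguration()    { /* ... */ }
    @Test public void rejectsDuplicateRuleNames()    { /* ... */ }
    @Test public void rejectsOverlappingDateRanges() { /* ... */ }
    @Test public void rejectsUnknownCurrencyCode()   { /* ... */ }

    // ...bad input to a correctly-configured instance...
    @Test public void rejectsNegativeQuantity()      { /* ... */ }
    @Test public void rejectsNullOrder()             { /* ... */ }

    // ...and the lone test of the behavior I actually wanted to understand.
    @Test public void appliesVolumeDiscount()        { /* ... */ }
}
```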

How can this be? (And what does this have to do with Keavy’s Ironman experience?)

About half-way down in her Ironman Cozumel blog entry, there’s a section entitled “Fear”. I thought about quoting a key sentence or two, but didn’t want to interfere with her well-crafted flow. So just take a minute and read it (at least—for now—the section under her “Fear” heading, between the two photos).

(Wow! You read quickly! 😉 )

So, as I looked at the test cases, a pattern began to emerge. The largest portion of the tests seemed to orbit around a particular aspect of the configuration that I took for granted, giving that configuration concept far more than its share of attention, IMHO. Which got me to thinking about similar patterns in other test cases. Which prepared me to have an “Aha!” moment when I encountered Keavy’s description of fear of the ocean.

Fear

Unbalanced tests that over-emphasize some issues and under-emphasize others may be a subtle hint that fear is influencing the work. I’ve observed two common responses to fear in myself and others: obsession and avoidance. On reflection, I believe that those attitudes show up in tests that I’ve looked at over the years.

[Image: cyanoacrylate structure]

Obsession

We may joke about it, but I’ll admit to going back a few times to make sure the coffee pot is off, jiggling the knob one more time to verify that I really did lock the door, or repeatedly checking my pocket to reassure myself that my airline tickets didn’t jump out and hide under the couch. (OK, forget the “hide under the couch” part.)

In this light, I see the set of test cases mentioned above as being obsessed with a particular configuration issue. Those tests were me, patting my pocket again, trying the door again, pushing the switch again (even though I can clearly see that the light is not glowing…) If I find that my tests appear to be obsessing over a particular aspect of the task at hand, that might be a symptom of fear: a need for comfort and reassurance in the face of an unfamiliar domain concept, a language feature or API with which I’m uncomfortable, or a requirement that I don’t really understand. Or fear of moving on to the next use case, which brings me to…

[Image: the leap at The Sinks]

Avoidance

Inadequate attention to testing an aspect of the design may indicate a reluctance to engage fully with the issue. In that connection, Keavy’s description of initial hesitation at the water’s edge brought back a vivid memory for me.

My family loves camping in the Great Smoky Mountains National Park. A popular spot along the Little River Road called “The Sinks” features a waterfall almost directly under a bridge, followed by a long, deep swimming hole with steep rocky banks on both sides. Visitors can climb to a ledge which offers a clear jump into a deep part of the pool. Although that ledge is only about 12 feet above the water level, it looks much higher to a first-time jumper. I don’t laugh at young (or not-so-young) first-timers who hesitate on that ledge, because I still remember my first leap.

Even though my head knew it must be safe—I had seen many people take the plunge before me that day—my stomach and legs hadn’t yet gotten the message. I “paused” for several long moments before stepping into the air.

(I’m not talking about truly high-risk activity here, I should point out. A friend of mine has a truly terrifying story about jumping from height into unfamiliar—and unsafe—water and impaling his foot. So let’s keep it in the pool and well-known swimming holes, kids. And don’t forget your buddy.)

Of course, we developers never procrastinate in the face of a fear-inspiring task! Then again, there’s that bit of legacy code that nobody wants to maintain, much less rewrite. And the lingering suspicion that “starting with the low-hanging fruit” can turn into an excuse to put off the parts we’re secretly fearing.

So, how do I fight back, when an imbalance in my tests shows evidence of fear, either positively (obsession) or negatively (avoidance)?

Suggested antidotes…

First, I should consider whether the tests really are over- or under-emphasizing something. Different aspects of the code have different levels of risk and/or consequences, so I don’t expect an absolutely even distribution of attention. If there is a true imbalance (even better, if I’m about to create one), I need to recognize that and apply an appropriate antidote.

…to obsession

When I start bogging down, these questions can be useful:

Is this test about the product or about me?
Let me tip my hat to Ruby Koans, which gives beautiful evidence of the power of testing as a way to explore a language or framework. If a test can confirm (or correct) my understanding of something I’m using in this project, I certainly should write it! But I’m equally certain that I should keep it outside my project’s code base. (There’s a small sketch of such a test after this list.)
What does/will this test teach me?
If it conveys useful new information, or affects the code in a material way, fine. But if it’s just a trivial variation on a theme, perhaps I should move on. If I’m wrong, I can always come back and add a test later.
How likely is this?
If this test is exercising a scenario that would almost certainly never occur in normal use, maybe I should take care of the more-likely cases first.
How bad can it be?
Do I really need this last little bit of yak hair?
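
Here is the sketch promised above, in the spirit of Ruby Koans: a tiny “learning test” of my own that pins down how the platform behaves (in this case, Java’s String.split). It earns its keep in a throwaway scratch project, not in the product’s suite.

```java
import org.junit.Test;
import static org.junit.Assert.*;

// A learning test: it exercises the platform, not my product,
// so it lives outside the project's code base.
public class StringSplitLearningTest {

    @Test
    public void splitDropsTrailingEmptyStrings() {
        assertArrayEquals(new String[] { "a", "b" }, "a,b,,".split(","));
    }

    @Test
    public void splitKeepsLeadingEmptyStrings() {
        assertArrayEquals(new String[] { "", "a", "b" }, ",a,b".split(","));
    }
}
```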

…to avoidance

Keavy wrote about overcoming fear by focusing on technique, process, and details under our control. My high-school marching band director kept taking us back to the basics, with scales on our instruments and eight-to-five marching drills until our feet could hit the chalk line precisely, with no last-minute stretches or stutters. In fact, without even looking. For this side of my fears, here are some questions…

What do I not know?
If I can express it as a test, I can get moving (there’s a small sketch after this list).
How would I explain this to ___?
I have gotten past mental blocks by explaining my impasse to a friendly non-programmer. Being clear and non-technical often has amazing benefits! 😉 Fill in the blank with a name of your choice. Sometimes just imagining the conversation is enough. It also helps to imagine it out loud (but not where I could frighten or distract my colleagues!).
Am I trying to swallow an elephant?
The last time I got stalled out, I let circumstances outside the task at hand stampede me. I blurted out a train of thought into code and quickly ended up, as I once heard someone say, “crocheting in logic space”. Admitting that fact to myself allowed me to back out of my self-imposed impasse.
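
And here is the sketch promised for “What do I not know?” The unknown in this example happens to be about date arithmetic rather than anything from the project described above, but the move is the same: writing the question down as a test is usually enough to get me moving.

```java
import java.time.LocalDate;
import org.junit.Test;
import static org.junit.Assert.*;

// Expressing what I don't know as a test: what does plusMonths do
// when the target month is shorter? The assertion records the answer
// once I've run it and (if necessary) corrected my guess.
public class DateArithmeticQuestionTest {

    @Test
    public void plusMonthsClampsToTheEndOfAShorterMonth() {
        assertEquals(LocalDate.of(2021, 2, 28),
                     LocalDate.of(2021, 1, 31).plusMonths(1));
    }
}
```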

…and a couple of challenges:

Don’t just sit there, DO SOMETHING!
Sometimes even a stumbling start is better than stalling, because it gets me moving. As with explaining my problem to a non-programmer, thinking out loud in code may help me get a better perspective (even if I end up throwing away that first bit).
Don’t just do something, SIT THERE!
In my best ersatz-mystico-philosophical style, sometimes I need to do just the opposite. Stop fidgeting and pacing back and forth. Sit down. Close my eyes. Be still. Take a breath. Think a moment. In that jam there’s probably one log that I can shift a little, and it’s probably not the one I’ve been tugging on unsuccessfully for way too long. Now open my eyes. Look around, somewhere other than where I’ve been staring. Oh. Duh. There it is.

Let me get back to you. I’ve got a test to write.

The value of persistence

After beginning this post, I learned that Keavy finished in Cozumel! Congratulations!


Code koans:

Ruby Koans (mentioned above) is the example I have in mind. I’m sure there are many more; please let me know.

