Why I’ll Be Back at STARWEST

I’ll be returning to Anaheim next month to attend STARWEST1. I’ve been a couple of times previously and have generally found it to be a useful, fun experience.

In looking at this year’s program, several topics/speakers stand out. I’ve found great value in the half- and full-day tutorials. One thing I like about STARWEST is that during the “regular” conference talks there are five or six tracks being offered, which generally means there will be something of value in each time slot.

Beyond the program (and this really applies to all conferences, not just this one) is the other key value in these events: the informal networking and conversations. Whether it’s over lunch at the event, in the hallway between sessions, or in the evening at a bar, I’ve made some great connections with folks from around the country who share similar (or dissimilar) testing experiences.

Nighttime at DCA

Looking towards Paradise Pier at Disney’s California Adventure

Also: it doesn’t hurt that I’m a Disneyland fan.

Will you be at STARWEST? Let’s connect.

  1. I have no idea why STARWEST is in all capital letters. But it is. 

When Quality Loses

Context: agile development with prioritization and release decisions being made by a product owner.

There’s often a false understanding of software quality (and the responsibility for software quality) in our industry. This falsehood isn’t helped by the “Quality Assurance” job title. With modern development practices, it’s misleading to presume that software testers are responsible for the quality of the released software.

QA as a Quality Advocate

As software testers, we identify potential changes to the software. Sometimes it might be an obvious bug, where the software is not producing the response that’s clearly expected. Other times we might find potential enhancements, such as new features or usability improvements. Either of these categories provides opportunities for improving the software. As a software testing and quality professional, I feel I have an obligation to suggest that the software could always be better. When quality wins, users will have a better experience, and data will be in a quantifiably better state.

As a tester, I advocate for quality.

Testing != Release Decisions

While I advocate for quality in the software I test, the ultimate decision on when to release (given whatever is known – or not known – about the quality of the software) belongs to someone else. In the agile world that’s usually the Product Owner; in other environments it might be a project manager, release manager, or another similar role.

That person – the one making the release decision – is the one who ultimately decides what level of quality is acceptable for a software release. Testers can help inform, but testers can’t insist.

Sometimes, we’ll advocate and our voices will be heard and the quality threshold will be raised prior to release. Sometimes, our voices will fall on deaf ears, or be drowned out by other voices or pressures.

Parked Cars, San Bruno Gas Line Explosion, 2010

The Release Where Quality Loses

When the quality isn’t up to par but the software is released anyway, the predictable repercussions include:

  • increased number of bugs-found-after-release
  • increased number of user support tickets
  • increased number of data or application hotfixes to resolve problems
  • PR or perception problems

Nobody in the development and product teams should be surprised by these results. Sometimes there’s value in releasing the software, even in a state of lessened quality, rather than holding it back to resolve more bugs. Quality is one of many factors weighed in the release decision. Sometimes quality loses.

As testers, we have to be okay with this, with the caveat that it’s not okay for the product team to blame the testers for the quality level of the product. While many of us have the misnomer of “quality assurance” in our job titles, we can’t assure quality when the release and budget decisions are out of our hands.

image via Thomas Hawk; used under Creative Commons licensing

Jonathan Coulton’s Still Alive as a Software Project Retrospective

Jonathan Coulton

Sure, it came from the video game Portal, but Still Alive seems like a hodgepodge of gems that could be used as we look back on a software project. We start with the beginning of the song, referencing a project that went well:

This was a triumph.
I’m making a note here:
It’s hard to overstate my satisfaction.

Sometimes things don’t go so well. Bugs happen:

But there’s no sense crying over every mistake.

I assume this couplet is about a burndown chart:

Now these points of data make a beautiful line.
And we’re out of beta, we’re releasing on time.

Cake is the promised reward in Portal, and hey, who hasn’t met a group of developers motivated by unhealthy baked goods?

Anyway, this cake is great.
It’s so delicious and moist.

And here’s the full song, with the screen as it plays in the game’s final credits:

Image by Flickr user nickstone333, used under Creative Commons licensing

Coming Soon to a #Starwest Near You

Last year was my first time at Starwest, a conference for testers held in Anaheim.

Monorail Orange

As a famous Californaustrian once said, I’ll be back.

I’ll be in Anaheim from the 11th through the 16th, taking workshops from Michael Bolton, Rob Sabourin, and Bob Galen, followed by the main conference.

If you’ll be there, let’s connect!

Unrelated to Starwest, I’m leading a photowalk the evening of the 11th. Hit that link to find out more or register.

Great Expectations

As a software tester, I have great expectations.

  • I expect that as I test a feature, the functionality will match my understanding of the user story and related discussions
  • I expect that the software will have an intuitive user interface
  • I expect that the software will be consistent with itself, with other similar applications we’ve developed, and with industry standards

Sometimes my expectations are met. Sometimes I find that the software behaves differently than I expected.

When behavior differs from expectations, have I found a bug? Perhaps. Or perhaps my expectations were wrong.

Conversation Starters

When software behavior differs from my expectation as a tester, more often than not it’s a conversation starter[1]. It may be time for a conversation with the product owner, to see if my expectations are in line with his expectations for the functionality of the system. Or maybe it’s time for a conversation with the developer, to figure out whether her expectations differed when she wrote the code that’s not behaving as I expected.

Some scenarios:

  • I expected the client list screen to be sorted by last name, because hey, that makes sense, right? But perhaps the product owner told the developer that they wanted it sorted by last activity date instead.
  • Perhaps the data field on the screen is allowing for a different sort of input than was noted in the user story. Rather than assuming the developer is incompetent, I can ask whether the desired behavior changed without the user story being updated.
  • Often, especially with non-standard use cases, I run into an error situation that’s handled in what seems like a strange way. Developers on my team have learned what’s usually coming when I start a conversation with “What would you expect to happen when…” and lay out the scenario. More than once I’ve discovered a workflow or use case that hadn’t been foreseen, so my expectation was based on something the developer hadn’t even considered.

Sometimes my expectations are “correct.” Sometimes the desired behavior is different from my expectations.

Expectations lead to revelations which lead to conversations, which may or may not lead to work to change software behavior. News flash: I’m not always right.

Related reading: the Huh? Really? So? section of my notes from James Bach’s workshop at STARWEST last year.

  1. Yes, there will always be the “it’s blatantly broken” bugs, but these aren’t the ones that usually cause process or personality grief.  ↩