Deploy small, deploy often.

I was recently part of a production deployment.  It was not like a deployment of the early 2000s, where everything went up and you expected it to work.  No.  It was deployed in parts because, well, we forgot stuff and we discovered stuff.

We deployed the product and started it up.  We discovered something we didn’t expect.  In the early 2000s, we would have executed the back-out plan and deployed another day (a delay in business value).  However, this is 2019.  We made a correction in production and tried again.  We did this several times until we finally had the entire product operating as expected.  Early reviews of its functions indicate that the product is operating successfully (read: providing business value now).  I love testing in production!

Could we have had a better plan for deployment?  Yes!  Based on this experience, our next deployment will be better.  Ain’t that being agile?

Did we deploy the product that was approved by multiple interested parties?  Hell no!  We deployed something that demonstrated excellent functionality in a non-production environment.  When we deployed to production, we solved issues like forgotten configurations, forgotten files, and an unexpected data format.  That is, we LEARNED a lot about our product in another environment, corrected our mistakes and misses, and got it working.  When it started working, then the real learning started!

We discovered some things our product could not do because of data quality (by the way, our error handling caught these!).  These will be addressed in a future deployment.  We discovered some functionality that needs tweaking.  This will be addressed in a future deployment.

Going forward, we deploy small and deploy often.  We will learn as we did above because it is soooo valuable!  Testing in production is not a crime, it is a learning experience!  If you can, go forth and learn!


Here’s a New Idea You’ll Love!

Think about the last time you heard an idea or suggestion that was extremely, violently orthogonal to how you think, believe, or live.  Of course, your reaction to it was welcoming and engagingly curious, and you had a deep, deep desire to try it.

Probably not.

I see this occasionally, as many testers might, when I suggest alternatives to existing or established testing practices.  In my experience, the reaction is usually negative.  Occasionally, there is so much push back that surmounting just the negativity would be a day’s work.

My approach has been to keep trying because I hope, perhaps a quixotic hope, that one day they may ask the key question.  I listen for this question every time I make my wild suggestions.  I hope, just once, they would ask how.

“Joe, how would that crazy idea be possible?”

For now, I wait until they are ready.  Forcing the idea and my notions of implementing it would waste time and drive them away.

How have you approached the suggestion of your crazy ideas?

The New QA

Our applications are growing larger and more complex in business functionality, and they are available on multiple platforms.  Our business partners expect more functionality with better quality delivered in a timely fashion while our technical leaders attempt to simplify designs and make it all possible.

The need to deliver value in an application AND make it simple to use challenges experience designers, developers, analysts, and especially testers.  The expectation that application behaviors can be evaluated in depth through just the user interface is not only a fallacy, it has become insufficient and risky.

High-profile projects in your organization must be delivered quickly, and as a tester you cannot wait for completed code to begin testing, nor can you assure quality merely by testing through a UI.  The definition of testing must extend beyond exercising code to evaluating requirements and risks.  To expect high-quality products at a good pace, you must foster and nourish quality as a team sport.  Your time is now and project teams are ready to hear you!

Testing now begins when work is assigned.  You are the Question Asker and you scrutinize acceptance criteria.
Soon after the assignment of a story card (or however your team conveys work), the Three Amigos is your first opportunity to foster team quality, your first opportunity to “test”.  Your tests include:

  • A check for acceptance criteria clarity
    • Does everyone have the same, deep understanding of WHAT is being built?
    • Have those understandings been verbally explored and vetted?
    • Have you vocalized your assumptions and had them clarified?
  • A check for validity
    • Is this product being constructed at the right time?
    • Are there dependencies that could prevent its timely completion?
  • A check for value
    • Some work was defined at some point in the past – does the present context of the project still support the work defined by this card?
    • Will the completed work still be valuable to the project and the product?

Testing continues as products are designed.  You are the Quality Advocate for testability in designs and implementations.
Most products require some thought before their construction.  The project team explores possibilities for implementation during design meetings.  Your testing continues at these meetings.  While you could also attend design reviews, a design review may imply that the design has been decided – this is too late.
When the design of a product is discussed, your primary goal is the testability of the product.  If the design addresses just the business intent of the product, then it is not testable.  As a tester, advocate for testability.  Make a request for error logging, for transaction transparency, and for data transparency.  An introspective product is a testable product.

During implementation, paired programming becomes paired collaboration.  You are the Quality Accomplice with simple tests at the ready and encouraging reflection on how the product in front of you will interact with the system.
Quality must be built into a product: the product is defined in a manner that is clear to everyone, the product is designed so that inspection is simple, and the product is constructed and checked multiple times during its implementation.  As a tester, you are the Quality Accomplice by having tests ready for execution at any time, and by working with the developer to help them create a quality product that meets the business intent.  To work effectively with a developer, you must understand the system and its components, and how the components interact to provide value.  With that understanding, you can ask questions about how the code in component A will interact with the code in component B under multiple scenarios.  This collaboration helps everyone reflect on the integration of the small parts defined in a story card and their contribution to a product that operates correctly in multiple scenarios.

I encourage you to go forth and experiment with these ideas.



A technical tip all testers should know

A Ministry of Testing Club post suggested we pen some thoughts around the topic title above.  In addition, it encourages us to blog for the first time or to start again.


Technical Tip: Never Believe What You See on the Screen

Your user interface will, occasionally, lie to you.  You can’t trust it.  The information you see on your screen will suffer from staleness, subtle misspellings, misplacement, absence, and other challenges that infect the journey of data from where it originates to where it’s displayed.

Staleness
You enter data, run the transaction, and the result is 14.02.  You enter different data, run the transaction again, and the result is still 14.02.  That’s probably a defect.  Verify the result at its source!  Maybe the database was not updated, or maybe the code that moves the result from memory to the screen didn’t execute or executed incorrectly.  Collaborate with your developer to find the answer!

Subtle Missppellings
Many times the text that appears on the screen was copied from requirements.  Who cares if the requirements are spelled wrong?  You can’t beleive how often this happens but often enough that I check the spelling of text destined for screen when it appears in the requirements.  When you find spelling errors in requirements, request to have them corrected and prevent this very annoying defect!  By the way, did you find both spelling errors in this paragraph?

Misplacement
When a paragraph or image appears in an odd place on the page, it’s easy to see.  When the placement is off by only a few pixels, it’s much more challenging.  A small difference can ripple through the page by moving text or other images, because page construction can be complex.  I recommend having both a sample of the page from your User Experience designers and the page constructed by the program available on your machine.  Move back and forth between the two using Alt-Tab to look for those subtle differences.
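The Alt-Tab comparison can also be automated with a crude pixel diff.  This sketch assumes the designer’s sample and a screenshot of the built page have already been decoded into 2-D lists of (r, g, b) tuples with an imaging library of your choice (the decoding step is not shown); the function and tolerance are my own illustration, not a standard tool.

```python
def diff_regions(expected, actual, tolerance=0):
    """Return (row, col) positions where two equally-sized images differ.

    `expected` and `actual` are 2-D lists of (r, g, b) tuples -- e.g. the
    UX designer's sample and a screenshot of the constructed page.
    A nonzero tolerance ignores small rendering differences such as
    anti-aliasing.
    """
    diffs = []
    for y, (erow, arow) in enumerate(zip(expected, actual)):
        for x, (e, a) in enumerate(zip(erow, arow)):
            # Flag the pixel if any channel differs by more than tolerance
            if any(abs(ec - ac) > tolerance for ec, ac in zip(e, a)):
                diffs.append((y, x))
    return diffs
```

A cluster of differing pixels along one edge of an image is exactly the “off by a few pixels” misplacement that the eye struggles to catch.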

Absence
You complete the transaction and expect to see your result on the page.  Instead, you see nothing.  The developer checks the code and verifies that it should appear.  It’s possible that your result has been placed in a position located off the screen.  The program is only too happy to place your text past the edge of your screen – after all, the position is just a number to the computer.  It knows nothing of screen size or boundaries.  This is especially true when positions are calculated and absolute.  You might use the developer tools to verify the text is a part of the page, and then collaborate with your developer to experiment with positioning.
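Once the developer tools confirm the element exists, the question becomes whether its rectangle overlaps the viewport at all.  A minimal sketch of that check, assuming coordinates you might read from something like the browser’s `getBoundingClientRect` (the parameter names are mine):

```python
def overlaps_viewport(elem_x, elem_y, elem_w, elem_h, view_w, view_h):
    """True if any part of the element's rectangle overlaps the viewport.

    (elem_x, elem_y) is the element's top-left corner; the viewport's
    top-left corner is (0, 0).  An element positioned entirely past any
    edge -- including at negative coordinates -- is invisible.
    """
    return (
        elem_x < view_w and elem_x + elem_w > 0 and
        elem_y < view_h and elem_y + elem_h > 0
    )
```

An element at x = 1300 on a 1280-pixel-wide viewport fails this check: the position is a perfectly valid number, and the text is perfectly invisible.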

No Limits #1

I made a wiring error that left this power supply useless.


I started from scratch to build this replacement.


Cool!  Now, build another one ’cause I need two.  Two that are almost alike.


This one needs a wicked heat sink.


How will all those components fit on this board?  Perhaps a different placement of the heat sink?  It seems I need more space.


I don’t have to use the original board.

When you think, believe, act, or talk with limits on methods, behaviors, or actions, you also limit your possible solutions.  See your world as if it always has fuzzy borders, as if the edges of your vision fade into a cloudy background full of possibilities, as if every challenge has multiple solutions.

As a tester, I am frequently challenged to scrutinize gherkin that describes what to build, software that was built using knowledge, experience, and interpretation, and, most challenging of all, suggestions on what and how to test.

You start by stating you see things differently.  Engage your peers in a conversation to nourish your gherkin into robust examples, to explore new and changed code frequently, to understand alternative viewpoints on what others want to learn about product behaviors.

Then, challenge (respectfully but with assertiveness).  I say challenge because sometimes products are not as complex or interrelated as they seem, because a simpler test may lead to adventurous exploring, because you need to use your time wisely, and because the succeeding conversation may uncover the assumptions that have become hidden with familiarity or painful with production issues.

Uncover the perceived limits and unlock many potential solutions.

It’s almost bullying

We’ve heard it many times and we’ve been told many times.  Too many times.  Too many times testers believe it when they are told “…there’s nothing to test.”  It’s the line used to brush a Tester aside or to subtly invite the Tester to leave.  I believe we should reconsider the phrase and take back control of what we, as Testers, believe the testing opportunities are.

When you’re told there is nothing to test, close the card.  Stop the execution.  Find another card for that developer.  When they say there is nothing to test, they are telling you they have no work to perform on the story card.  They’re probably gold plating existing code and don’t want to be discovered.  It’s a no-work Iteration for them.

It might be worse.  When I hear that phrase, it’s as if they are suggesting I’m too, shall we say, inexperienced to understand how their work might be tested.  Why aren’t we insulted?  I think it is borderline bullying.  Who are they to determine what can or cannot be tested?  You’re the tester!

Lastly, let’s turn the tables on them.  When you hear that phrase, you might suggest that the complexity of the story card might be beyond their capabilities.  Something like “…you know, the work defined for the story card is pretty complex.  Maybe YOU aren’t the right person for the card.  There is nothing for YOU to code.”

You are an equal member of the project team and deserve respect for asking about the testing opportunities for any work defined in your project.  Asking questions is part of your job and part of your responsibility.  Question the testability.  Question the design.  Question the implementation.  Challenge their opinion of testing opportunities and change their opinion of you.


It’s About Information

What the hell?  I’m getting pressure to make a date?  To be clear, this is the date we discussed about six weeks ago, when we made assumptions about some tasks, made assumptions about availability, and made assumptions based on what we knew.  Since then, tasks have taken longer because people are providing end-of-year feedback, we decided to have some people take training in preparation for the next release, and the vendor’s system has some behaviors we didn’t expect.

I should point out, because as a tester it is my responsibility to provide information, that our performance tests are incomplete (we have not executed the most important one), our production verification utility won’t be designed until the week before the date, and part of our solution has some inconsistent behavior.

I’m satisfied if the former is acknowledged and the latter is understood because, as a tester, it is not my responsibility to decide the next step.