
Quality Sense Podcast: Rob Sabourin — Testing Under Pressure (Part 1)


Podcast host Federico Toledo interviews Robert Sabourin, President of AmiBug.Com, Inc., a company focused on consulting, training, and professional development in all areas of software engineering.


This Unique Methodology Will Help You Test Software in Times of Turbulence

Welcome to the Quality Sense Podcast’s first episode! At Abstracta, we are thrilled to explore a new avenue for bringing you fresh content from some of the brightest minds in software!

In the premiere episode, host Federico Toledo interviews Robert Sabourin, Adjunct Professor of Software Engineering at McGill University and President of AmiBug.Com, Inc., which focuses on consulting, training, and professional development in all areas of software engineering.

What’s the Episode About?

As you’ve probably felt yourself, teams are under greater pressure during today’s global health crisis. How can testers adapt to shorter release cycles yet help maintain business continuity? Rob has developed five principles as part of his Just-in-Time Testing Methodology that hold some of the answers.

Learn why he thinks testers are quite used to turbulence, why purpose is now more important than ever, how to actively search for more context for better testing, and more!

Listen Here


Quality Sense, a Software Testing Podcast · Rob Sabourin (Part 1) – Testing under pressure

Episode Transcript

Federico:

Hello Rob, it’s a pleasure for me to be talking with you again, how are you today?

Rob:

It’s really good to see you, I’m doing great, full of all sorts of energy in the chaotic world that we’re in, I’m surviving. I’ve been at home for 42 days now. That’s interesting. But I’m happy to talk to you and excited to share some ideas with your audience.

Federico:

Thank you so much for participating. I remember that I met you… I think it’s already been two years since we met in Boston at a conference.

Rob:

Yeah, it could be.

Federico:

You gave a workshop on agile test automation that I really enjoyed.

Rob:

You took that?

Federico:

Yeah. It was a lot of fun and also I learned a lot. And something that I really like is that you like sharing your knowledge and you like sharing specific experiences that you’ve had.

Rob:

Right. I find that when you’re doing a course like that, a conference presentation, maybe half a day or a full day, I don’t remember which one it was in Boston.

Federico:

It was two days.

Rob:

A two-day one, okay. So that’s a very popular course, that agile test automation course. But I find, in a broad survey, that giving people a real example of every concept you’re trying to make helps the concept stick. It helps it stick; it’s like Velcro, it sticks. And I know that you can’t necessarily map it to your problem, but if it sticks, then you’ll remember, aha, Rob talked about this, and then you can dive in deeper in your own technical and business context.

I also share electronic copies of all my examples and case studies. I just want people to see examples of real stuff, not just theory, not just rhetoric, not just gaba, gaba, gaba.

Federico:

Yeah, I think this is the magic about telling stories, right?

Rob:

Yeah, it’s a lot of storytelling. That’s my style, people know me as a storyteller.

Federico:

And I know you had the chance to visit Uruguay last year, and I missed it.

Rob:

Yeah, you were in California, and I was in Uruguay. It was wonderful. I was invited to give a talk at the university, and I stayed for two weeks, and I visited some of your friends down there. But I basically had a chance to meet some of the testing community, a lot of the university community there, which was really cool. I’m very impressed by the fact that people are doing beautiful graduate work in the area of software engineering and software testing especially, it’s very close to my heart.

I had a little bit of time to see the country, and my wife came and we saw it together. I characterize Uruguay now as a collection of places with beautiful barbecued beef and wines. That’s sort of my memory of it. Lots of beef, lots of wine.

Federico:

So I’m happy that you had a great time there.

Rob:

I felt very welcome, it’s a beautiful place. If anyone has a chance to go, it’s a wonderful part of the world.

Federico:

Thank you, excellent.

So the main topic I wanted to discuss with you today is related to something that I believe that many testers and people working in software companies are facing today because of this lockdown situation, because of COVID-19. Which is basically that companies are thinking of different ideas or pivoting their approach or adding specific features or landing pages or different things in order to find a way to survive, to guarantee business continuity.

And probably because of that, many testers are testing under big time restrictions or under pressure. And I know that you had been working on different principles for testing under pressure. So I would like to see how we can apply those principles under these circumstances nowadays.

Rob:

It’s very important to realize that even though today the constraints of our universe are evolving and changing dramatically, still business goes on somehow. Things are still happening. And you’re right, people are making really difficult decisions, what to do and what not to do.

And these decisions land on desks of testers who are now working from home, away from their normal resources and teams. And they’re basically being challenged to deal with turbulence. And that turbulence is change. Lots and lots of change. 

And I like to suggest that dealing with turbulence is something that we in testing are actually inherently good at. But maybe we don’t remember that we’re good at it sometimes. So I urge people to sort of get back to some fundamentals and to think practically. You have skills, you have knowledge and you have experience, and applying it might be outside of your comfort zone, but it’s not outside of your skill zone. So I want to make sure that people are uncomfortable, that’s true, but I think they can succeed. I think they can do really really well.

Federico:

It’s important to remember what we are, and what we know, and try to apply all of this.

Rob:

Exactly, and so I basically came up with a method of testing many, many years ago, called Just-in-Time Testing, this is in the late 1990s that I started publishing about that and sharing my ideas. And people sort of liked it, but they said, “Rob, can you slice it up into chunks, right? Can you slice it?” So I said, “Yeah, if you want.”

A lot of it is testing under pressure when we don’t have time. And I looked at this and I said yeah, there’s never time, and testing is always under pressure.

In fact I used to tell people if you’re testing and you don’t feel that you’re under pressure, what’s wrong with your project?

There’s always pressure. So I came up with what I call the five principles of testing under pressure, and there are probably more than five, but these are five that I’ve helped people with and used myself. Even two weeks ago, I was doing live virtual training for people in North Carolina working on an important software project managing medical data. And they were looking at these things saying, “Yes, we know that.” But they hadn’t thought of it the way I was presenting it, and it helps people to frame it a little bit differently.

If I can share them with you, that would be probably the best thing-

Federico:

Please.

Rob:

My principles for testing under pressure: One, purposeful testing. Two, active context listening. Three, flexible decision-making. Four, triage, ruthless triage. Five, always know the last best build. These are five names of principles. 
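The principles themselves are qualitative, but the fourth one, ruthless triage, can be illustrated with a small sketch: under a fixed time budget, rank candidate tests by how directly they serve the stakeholder's purpose and how risky the area is, then run only what fits. This is not Rob's tool, just a minimal illustration; the scores, names, and scoring formula are all hypothetical.

```python
# Hypothetical sketch of "ruthless triage": rank candidate tests by
# purpose alignment times risk, per minute of effort, and greedily fill
# the available time budget. All scores and test names are illustrative.
from dataclasses import dataclass

@dataclass
class CandidateTest:
    name: str
    purpose_alignment: int  # 0-5: how directly this informs the project's "why"
    risk: int               # 0-5: likelihood/impact of failure in this area
    minutes: int            # estimated time to execute

def triage(tests, budget_minutes):
    """Greedy triage: highest (alignment * risk) per minute first."""
    ranked = sorted(
        tests,
        key=lambda t: t.purpose_alignment * t.risk / t.minutes,
        reverse=True,
    )
    plan, used = [], 0
    for t in ranked:
        if used + t.minutes <= budget_minutes:
            plan.append(t.name)
            used += t.minutes
    return plan

tests = [
    CandidateTest("checkout flow", 5, 4, 30),
    CandidateTest("legacy report export", 1, 2, 45),
    CandidateTest("login", 4, 5, 15),
]
print(triage(tests, 50))  # the low-value legacy test is cut
```

The point is not the formula; it is that triage is explicit and repeatable, so when the context changes (a score changes), the plan changes with it instead of defaulting to "whatever we started testing yesterday."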


But each one of them has an incredible bearing on the ability of a tester to work under extremely harsh contexts. Like literally you walk into the office and you’re in a different business. Like you were developing web software on Monday and on Tuesday you’re doing mobile devices. And on Thursday, it’s microservices, and it’s all changing, before you finish one thing, the whole project changes. How do you deal with that?

And I would like to say you can, you can, but it’s a question of being able to apply your skills in these contexts. And of course for each of these principles, I have all sorts of stories and experiences and stuff like that. But if I was doing things in the sort of COVID-19 universe, I would probably start with the first one, purposeful testing. What is the reason for the project, what is the reason for the change? If you don’t know the reason for the change, if you don’t know the motivation for the change, how can you decide what to test and what not to test?

Edsger Wybe Dijkstra demonstrated clearly in the 1960s that exhaustive testing is intractable: there’s no way you can demonstrate by testing that a product has no bugs. There’s always another test; there’s an infinite number of tests for even the simplest application. So you never have enough time to test everything. You always have too much to test and not enough time.

How do you decide what to test, what not to test? First thing is purpose, why are we doing the project? If I was looking at something and someone threw it at me, and I had only one question I could ask my stakeholders, it would be why are we doing this? And if you know why, then you could help answer the question, does this product deliver on the why? If you don’t know the why, you’re just looking at menus, docs, controls, APIs. You could waste hours testing things that don’t matter. I don’t have time to test something that doesn’t matter, so I want to know why I am testing.

Federico:

I really like this, and I think it applies also for any activity you do, because if you-

Rob:

Almost anything in life, right?

Federico:

Yes, that’s true. But thinking specifically in software development, it’s like if you’re a programmer, if you’re programming something, and you don’t know why you’re doing that, it’s the same thing, right?

Rob:

And when you look at failed projects, people blame things like onshore, offshore, chaos, communication problems. Most of it is not communications, it’s purpose problems. On time, on quality, on budget is meaningless unless you’re on purpose.

In chaotic turbulent circumstances, what I’ve learned is purpose can help guide me, purpose can tell me what our stakeholders want to learn about. Purpose can help me decide between two things, should I test A or B? It helps me decide that. It doesn’t give me the answer, it doesn’t tell me what exact test case to do, that’s my skills that are going to help me do that. But it gives me guidance to what is important.

And purposeful testing has to do with a lot of factors. One of the most important ones, as you know, is: what is quality? When people are asking me to test something, they’re usually saying, give me a judgment or statement of quality, I want to learn about quality. Well, what is quality? Quality could be a lot of different things.

There are so many views of quality, and if you don’t know your stakeholders’ view of quality, then you may deliver amazing tests and test results that are useless to your stakeholder. Yeah, they’ll put it in a box, they’ll put it in a binder, they’ll put it in a spreadsheet, they’ll put it on a dashboard; who cares if it’s on a dashboard? It’s a shiny light. What matters is: are we delivering information about the stakeholder’s view of quality?

And I will argue that a popular view of quality used to be, still is pretty much, conformance to requirements, right? Quality is conformance to requirements, that’s a total quality movement view. It’s also part of the ISTQB view of quality, it’s part of IEEE view of quality, it’s in many places.

And another view of quality goes back to the work of Joseph Juran in the 1950s:

If quality is not conformance to requirements, quality is suitability to purpose. So conformance to requirements is one view of quality, suitability to purpose is another view of quality, and they’re not the same.

Rob Sabourin

If you care about suitability to purpose, you’re going to model your test around what the user does. If it’s conformance to requirements, you’re going to model your test around what the software does. That’s two different models. Neither are right or wrong, it’s the stakeholder that matters.

So what superpower do you want to use, what skill do you want to use? Ability to elicit information from stakeholders about quality, what matters. And that helps you more than anything. That’s the first principle.

Federico:

That’s really cool, and you’re making me think about a really short project we just participated in, also because of COVID-19. A bunch of companies in Uruguay collaborated, mainly volunteering (we did some performance testing), to build an application for tracking the different COVID-19 cases, and they are publishing that information, the official government information, through the app. And it was amazing to see that different companies that hadn’t collaborated before were able, in that very short period of time, to release an application that works.

And I think one of the main reasons why this could work, was because there was a very strong motive, a very strong motivation behind that.

Rob:

A sense of purpose.

Federico:

Yes, totally. So I really like what you mentioned.

Rob:

Well, that’s the beginning. Without that, nothing else matters, if you accept my opinion. Okay?

Federico:

Yeah, it’s like the initial-

Rob:

I don’t impose it, but I believe it’s true.

Federico:

The initial feel, right?

Rob:

Yeah, but there’s more, there’s more. And the second principle is active context listening. When you are faced with turbulence and a lot of things are changing around you, certainly tools of testing and software engineering, like traceability, can help you, right? If this traces to that, if something changes, it helps guide you. And regression testing helps you build confidence. But there’s nobody in your company whose job is to tell you when a factor changes that influences how you should react to these things. Active context listening puts the onus on the tester to look for changes in context, to hunt them down, to watch for them, and then to say, “Okay, now how we’re working might be different, because a business, technical, organizational, or cultural factor around the project has changed.”

And if that’s changed, then the focus and scope of testing probably has changed. I don’t have time when I’m working under pressure to test the wrong thing. I want to test the right thing. Just because I started testing it yesterday doesn’t mean I have to finish testing it today. So I like to listen and figure out ways to learn about context, and I’ve found that a lot of testers know this, but they make the mistake of expecting somebody to come and give them the magical information. There’s no one who’s paid to do that! You’re paid to test the software, and part of your job is to learn, to stick your nose everywhere to find out what these factors are.

And when you start, you might think it’s a bit silly, but after a while, you’ll see the value of knowing who to talk to to learn about factors. You might talk to a salesperson, or the accounting department, or the production department, or the DevOps guys, I don’t know who it is, but somebody knows the factors. And these factors influence what matters.

These are called context factors, and I know testers have the skill of changing testing when the context factors change. But what they don’t often realize is that it’s their job to look for the change. Don’t just be reactive; be proactive. When we’re working under turbulence, we have to get something shipped. It’s not going to wait for you to do a big analysis, and it’s nobody’s job to tell you. So you have to work and learn.

And over time, you’ll learn who these people are, so that when you’re under pressure, under the gun, you’ll be able to use that information. It doesn’t take just five minutes to learn who to get context from. But with time you build a nice catalog of people, and eventually you start learning how to learn about context. You start doing things like looking at, literally, the sales funnel or something like that, some artifact in the company that isn’t really for technical purposes but that’s going to guide you to what’s important. So that’s active context listening.

Federico:

And it’s mainly listening to the right people, right?

Rob:

The right people, and they don’t even know that they’re telling you something that’s useful for testing, because they’re doing their job; they don’t have “tell testing about this” in their job description. They don’t have that. But if I talk to them, they’ll say, “Hell, yeah, hi Rob, here’s what’s happening.” I say, “Oh wow, this is happening.” And then, if you know the deployments of customers, you can change the priority of things. Different customers use different features in different ways, and if you only have a few hours to test, what are you going to do? You’ll look at the people who are going to be using the actual software, not the people who aren’t using it.

And the people who have this knowledge are usually in contract departments, not in technical departments. It’s not the programmers who will tell you that; a programmer will tell you what could break, not what’s actually being used.

Federico:

Yeah, that’s true. There are two different skills associated with this: one is listening, and the other is asking good questions.

Rob:

Good questions, now you’re nailing it. It’s the same thing with purpose, the same thing with active context listening. The best testers, and you’ve worked with a lot of testers, I’ve worked with a lot of testers, the best testers are people who can learn. Learning is really what we’re doing when we’re testing, in my opinion, and it’s about asking questions. So that’s the underlying skill; I’m sure experienced testers have it, just use it.

So for me, the COVID-19 thing is all about context changing right now. How we do business is changing; there are some things that we used to do that we don’t do anymore. So if we’re not doing B anymore, and I have time to test A or B, I would figure out a way to test B lighter and shallower, and I would test A more thoroughly. And I’d figure out a way to balance the scope and depth of testing based on the realities of today.

Now maybe in five weeks when things go back to normal or whatever it is, maybe these are going to matter again, but between now and then, even after it’s deployed, we can still test it.

No one said you have to stop testing just because you deployed the software. It’s just risks you’re learning about after you’ve deployed it. 

Rob Sabourin

Federico:

Yeah, and probably this makes more sense now, more relevance-

Rob:

For some projects, it does today.

Federico:

… but mainly because the context is changing very drastically from one day to the other.

Rob:

Every day. Crazy change.

Federico:

Yeah, there is a lot of new information or different sources, and you also have to distinguish which source of information is good and which one is not.

Rob:

That’s it, but that’s where you get the notion of active listening is what is your trusted source of information, and can you get closer to the source than the interpretation of it? If we have a lot of time and you have a wonderful project and you have a test manager with two months to analyze things, okay, maybe it doesn’t matter. But I’m talking about the real world here, turbulence, someone’s throwing it at you at the last minute, what do you do? You don’t have the whole phase for analysis, you have to get going. So this is where context listening is critical.

Federico:

Cool.

Topics:
agile testing, coronavirus, interview, performance, podcast, software quality, software testing

Published at DZone with permission of Kalei White. See the original article here.

Opinions expressed by DZone contributors are their own.
