
June 27, 2023

Quality-Driven Product Development with Realtor.com’s Erika Chestnut

In this episode of the Built Right podcast, we look at the often overlooked and undervalued topic of quality in software development, and how good process and culture create the foundation for it.

Joining us is women-in-tech career coach Erika Chestnut, Head of Quality at Realtor.com. Erika has been building and leading quality teams for around 15 years. She has a wealth of knowledge to share about the foundations of good quality, why organizations that want to improve quality often focus on the wrong thing, how to balance quality with innovation, and which leading indicators of quality to watch.

Keep reading for some takeaways from the episode or check out the full discussion below. 

Many organizations focus on the wrong things when trying to improve quality, treating testing as the sole determining factor, explains Erika. However, true quality in software development starts much earlier in the process.

 

Shifting focus: moving quality up the chain  

According to Erika, when people think about quality, they think about the end state and therefore they land on the thing that happened right before the end state, which oftentimes is testing.  

However, she explains that quality should be addressed throughout the entire process. Shift-left testing, as the industry calls it, means moving validation, checks and awareness further up the value stream.

It involves examining your processes and the impact they are having on product quality. 

Erika likes to say, “good process creates quality; good process results in quality,” because process creates consistency and continuity, which in turn produces quality.

While testing requirements, are you also checking:  

  • Your process 
  • Your communication flow 
  • Your documentation 
  • That everyone has what they need 

 

By expanding the scope of quality beyond testing, organizations can address critical factors that are impacting the end result.  

 

Quality everywhere: process, tools and more

Erika emphasizes how opportunities to lead with quality are everywhere.  

It’s a case of asking: can we improve quality in our processes? Can we improve the tools that we use and how we leverage them? Are the tools implemented in a way that’s cohesive and integrates into our system in a meaningful and impactful way? 

Every opportunity within the development lifecycle to enhance quality should be considered.

 

Common pitfalls and opportunities for improvement  

When asked about a common pitfall that Erika witnesses time and time again, she hits on one of her biggest pet peeves, which is the lack of clarity regarding the business structure and flow. She emphasizes that understanding the structure and flow of the business is fundamental for ensuring quality. 

Without a clear big picture view, teams may become blinkered within their own domains, missing crucial integration points.  

Leaders need to communicate the organization’s structure, product flow and internal narratives, enabling teams to grasp the interconnectedness of their work.  

This type of awareness fosters better collaboration and a holistic understanding of how each team contributes to overall quality.  

 

Balancing quality and innovation  

Finding a balance between quality and innovation can be challenging for companies, Erika explains.  

While innovation is often the top priority, focusing solely on it can jeopardize quality.  

Rapidly introducing new features or functionality without addressing underlying issues can lead to long-term problems.  

Erika emphasizes the need to consider the impact of innovation on quality. Monitoring leading indicators such as defect density, release health, rollbacks and mean time between failures helps identify whether innovation is having a negative effect on quality.
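As a rough illustration of these leading indicators (the record fields and units below are assumptions for the sketch, not anything prescribed in the episode), they can be computed from basic release and incident data:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Release:
    shipped_at: datetime
    defects_found: int   # defects attributed to this release
    rolled_back: bool

def defect_density(releases, changed_loc):
    """Defects per 1,000 changed lines of code across a set of releases."""
    total_defects = sum(r.defects_found for r in releases)
    return total_defects / (changed_loc / 1000)

def rollback_rate(releases):
    """Fraction of releases that had to be rolled back (release health)."""
    return sum(r.rolled_back for r in releases) / len(releases)

def mean_time_between_failures(failure_times):
    """Average hours between consecutive production failures."""
    ordered = sorted(failure_times)
    gaps = [(b - a).total_seconds() / 3600 for a, b in zip(ordered, ordered[1:])]
    return sum(gaps) / len(gaps)
```

Tracked over time, a rising defect density or rollback rate, or a shrinking time between failures, is the kind of red flag Erika describes for innovation outpacing quality.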

 

Recognizing the value of process and quality  

Helping businesses recognize the value and impact of process and quality is crucial before issues arise.  

Erika advises aligning the story of quality with the goals and interests of the business to create a compelling narrative. By connecting the dots between process improvement, quality enhancement and business outcomes, leaders can appreciate the significance of investing in quality.  

Analyzing metrics such as defect density, release health and customer feedback serves as tangible evidence of the impact of process and quality initiatives.

This approach can help foster a culture of quality from the top down, ensuring that process improvements receive the attention they deserve.  

 

To hear more about quality in software development, tune into the full episode today.  

Subscribe to Built Right for more engaging conversations that will help you build the right products the right way! 

Matt Paige: Today we’re chatting with Erika Chestnut, a true champion of quality. She’s been building and leading quality teams for around 15 years now, at places like Kabbage, Turner Broadcasting, Calendly and Realtor.com, to name a few. And that’s an awesome list, by the way, Erika. Before that she led development teams, and she was even a developer herself, so I know you’re gonna fit right in with our Built Right community here. But welcome to the show, Erika.

Erika Chestnut: Thanks. Thanks for having me. 

Matt: Yeah, excited to get into this topic. This is one we haven’t gotten into yet on the Built Right podcast. Today we’re getting into the often overlooked and sometimes undervalued topic of quality in software development, and why good process and culture are really at the foundation of good quality. And PS, for everybody listening, stick around. We’re gonna get Erika’s perspective on generative AI and how it’s impacting the discipline of quality. I know everybody’s talking about it, so we want to get Erika’s take on that as well. But to set up the problem, Erika, you talk about how organizations that want to improve their quality are often focused on the wrong thing. So what is this wrong thing and what can they do about it?

Erika: Yeah. Quality is always... not always, that’s a poor statement. Oftentimes, when people think about quality, they think about the end state, and therefore they think about the thing that happened right before the end state, which is oftentimes testing. And so when they say our quality is not good, they say our testing is not good, or we are not investing in the right type of testing, i.e. manual versus automation, or we don’t have enough coverage. We don’t have enough code coverage, we don’t have enough functional or non-functional testing. But the reality is that it actually starts much further up the stream. And you started to hear about this when the industry was like, shift left with testing. But then, just like most buzzwords (innovation, innovative, right?), it’s not unpacked. And so now it’s like, shift-left testing: okay, what does that genuinely mean? And what is the impact of that?

Matt: And real quick, shift-left testing, that means moving quality further up the value stream, toward the beginning of the process. Is that right? Just for listeners?

Erika: Yes. But it’s not moving quality up. Well, it is moving quality up, but it’s really about moving the validation, the checks, the awareness: what is impacting our product quality? And so one thing that I always love to say is that good process creates quality; good process results in quality, because process creates consistency and continuity, which results in quality. So when you say moving quality left or further up the chain, people are still thinking testing: oh, we’re testing the requirements. Are you checking your process? Are you checking your communication flow? Are you checking your documentation? Does everybody have what they need? Are you checking to make sure that the quality team is not starting the new sprint at a deficit, because the engineers didn’t stop starting and start finishing? You’ve gotta shift the idea of what impacts quality, what creates poor quality. And it’s not just testing.

Matt: Yeah. And you make a good point, cuz quality is at the end for all intents and purposes. That’s the last thing: let’s check everything, make sure it’s good to go. And a lot of times they can be the scapegoat when something goes wrong or doesn’t get delivered right. And I love how you hit on this concept of process. To clarify, a lot of people think process, they think tools, but it’s not about the tools. Tools are often put on a pedestal, as in, oh, they’ll fix everything, but it’s not about the tools. It’s the underlying pieces in the process. And I love how you talk about the culture element that comes into play as well.

Erika: Yeah, and there’s definitely tools, and it’s testing that’s at the end, but quality is the entire thing. So actually I have to retract my statement: it’s not about moving quality. Quality doesn’t need to be moved. Quality is everywhere, right? The opportunities to lead with quality are everywhere, so it’s not about moving it left or right or up or down. It’s about acknowledging that there are opportunities to improve quality in everything. Can we improve quality in our process? Can we improve quality in the tools that we use and how we leverage them? Are they the right tools? Are they answering the right question? Are they implemented in a way that’s cohesive, that integrates into our system in a meaningful and impactful way? All of that is quality. All of that produces quality at the end state. And they all come together. It’s not just testing. It all comes together to produce quality.

Matt: Yeah. So to make this more real, I’m curious, cuz you’ve been at a lot of interesting companies, from small to large, and I know you’ve done some side consulting in the past. What examples do you see? What are those common pitfalls that companies have, whether it be process related or just in quality in general? Is there anything where it’s like, I see this every time, or this is a big thing that typically happens a lot?

Erika: One thing that I see all the time... it’s, I think, a pain point. Or a pet peeve? That’s actually a better word. It’s a pet peeve of mine and I see it all the time: the structure of the business is not clear. And it’s a fundamental quality opportunity that is missed. The structure of the business is not clear to the teams, or the business flow: what are the boxes that make up the business, and how does it flow left to right? What are the little exits along the way? What happens is not fully unpacked for the team. And then companies are going through these massive hiring windows, and we’re throwing people in and we’re saying, hey, listen, go to your team, they’ll help you. But the team has blinders on. They’re like, this is our little world. We’re not providing this big-picture view for people to understand at the top level: this is our business, this is our structure, this is how we talk about ourselves internally, and this is very clearly how it moves down into the organization, from a structure and a business-flow perspective, the actual product. And so I find those are missed opportunities oftentimes. Leadership doesn’t recognize that it’s impacting quality. I’ll go into teams and I’m talking to teams and they’re like, I don’t know about this. I don’t know how this integrates with this other system. I had one manager say, my enterprise area doesn’t integrate with this other, like, this main area of our product. It does. It did. And they didn’t know it. So we put on these blinders and you’re like, hey, I’ve got my area and I’m good.
It’s like, but are you thinking about how your area integrates with these other areas and what the impact is? Do you understand, and are you mindful of that? Yeah, so that’s the thing that I think is missed.

Matt: That’s interesting, especially I guess when you get into larger, scaled organizations. But it gets back to, we talk a lot at HatchWorks about connecting to the outcome and understanding the outcome, and knowing that in all layers of the organization. It’s so important, cuz you have to understand what business outcome is trying to be achieved. But I love your point around the connection between multiple teams and having that quality understanding between the different organizations. Quality and innovation: in my mind, I feel like they can sometimes be at odds. Quality is very much process driven; rigid’s not the right term, but you do want foundational process in how things are structured. And then on the innovation side, a lot of times, whether it’s business model innovation or anything like that, you’re thinking of breaking process and norms. How do these two play together, and how do you create balance between quality and innovation?

Erika: Most companies struggle with that as well, that balance of quality to innovation, because obviously the business is running after innovation. They wanna stay ahead in the market. They want to be first to make that next big change. They want to be the unicorn in the space. To do that, sometimes you’re running fast and you are focused on the new whiz-bang feature that you have, but that can be a struggle. It can come at the expense of quality. And if we’re not looking at it, if we don’t pay attention, we’re like, listen, just innovate, get these new products out, get this out. And you might have the quality teams saying, there’s a problem with our quality. There’s a problem with our architecture, and we’re building these new features on top of it. And so these new features are nice and shiny, but we’re putting them on top of something that doesn’t smell so great. And eventually the new shiny thing will wilt and it will also smell, because we never fixed the actual problem. We never cleaned up the smelly stuff, right? And that’s part of quality, but it’s not just bugs. It’s what might create delivery problems. What might create inefficiencies? What doesn’t allow us to roll back quickly if we have problems? How long does a problem linger out there? How many open issues do we have? Even just meeting acceptance criteria: the churn of how frequently, how long it takes to get something delivered. And then we’re taking shortcuts because the requirements weren’t 100% clear, and so we had to go talk to product a lot, and then we went back and forth and we made changes. And then all of a sudden, something that we developed two weeks ago that was actually pretty well baked has now been hacked at the very end and released out there.
And then there’s an edge case that we didn’t know about, but it’s like an extreme edge case, right? So it’s that innovation. When you think, are we innovating too quickly over quality? What is the impact of our innovation on quality? Did we release these new features, this new functionality, this new innovation, and do we see a high level of defects? Did our defect density increase? Did our CSAT scores go down because our customers are like, ew, this is broken? You told me about this new hotness and now I’m coming here and it’s just broken. That sucks. I don’t wanna use your product anymore. I don’t wanna tell somebody else about your product. Right? We have to balance that, but it’s oftentimes a struggle.

Matt: It’s almost like quality in a lot of ways is the enabler for innovation. If you don’t have that foundation set, it makes innovation that much more difficult to actually do. And I love that, I got a visual in my head. I have a one-year-old baby, so when you mentioned the didn’t-smell-so-great, that’s bringing up some bad memories from last night. Things were thrown away. I don’t want to get into it. But you hit on some other things. In our business, we’ve done kind of a foundational shift as of late, really focusing on what our leading indicators are versus our lagging indicators. What are some good leading indicators in quality? You mentioned, I think, time to resolution and things like that. What are you looking at, whether lagging or leading, that’s indicative of either things going well or maybe I need to hone in on a certain area?

Erika: Yeah, like that defect density, right? What does our release health look like? Do we see a lot of releases going out where our health dips? Are we seeing a lot of releases returned, a lot of rollbacks, reverts, incidents? What’s our mean time between failures in production? These are all alarms. These are red flags that we can look at and say, maybe we’re innovating too quickly. Maybe we need to slow down. What’s causing this? Maybe we need to look at whether our requirements were not fully baked. Were our acceptance criteria not clear? Where was the failure? Did we push in something really late that increased defects? What type of failure is happening? Is it a backend failure? Is it a load capacity issue? These are all things that, when we begin to unpack them, we say, hold on, we’re seeing an increase here. Let’s look at it and understand what the problem is so that we can target it, fix it, and then go fast again. But oftentimes they don’t. That’s such a foundational...

Matt: Yeah. That’s a foundational piece, knowing what those metrics are. So you have your dashboard, for lack of a better term, of your indicators. And then when something’s off, you know where to dig into. And I heard you mention defect density. Is that just volume of defects, or is it hitting on something more specifically?

Erika: Yeah, it’s volume of defects. So let’s say that we’ve identified 200 defects in production. And please don’t start me on math, because math is hard. So we’ve identified 200 defects in production.

Matt: No, it’s Friday for us. We’re not getting into math.

Erika: It’s Friday and it’s been a Friday. Yeah. Good. Not in a Margaritaville kind of way, although maybe it needs to be very soon. 

Matt: Yeah. That’s next. 

Erika: Yeah. But, like, the number of defects. And then let’s say that we have a trend. We see that we have maybe some spikes here and there, but we start to recognize that those spikes are happening every time we release to production. That means we’re recognizing that with every release we are introducing a spike of issues that we then have to work back down. How do we improve that spike? Is there a correlation? And then what is causing that? What are we missing? Do we not have enough automation regression? Are these regression issues? Do we not understand our system well enough to understand the impact of the changes that we’re making on downstream areas of the system? What is creating that spike? Which is costly, because especially if it’s a critical area of the system that generates an incident, you’ve got no less than 10 people jumping into that conversation. You’ve got the eye of the CTO, the eye of the CMO, so you’ve got executive leaders and you’ve got senior leadership. This gets really expensive, and they’re just looking at it and waiting, and they’re jumping in, they’re engaging in the conversation. And then you’ve got the middle management layer, and then you’ve got the ICs that are implementing, potentially, the changes. You’ve just got a lot of people in that. It’s costly. And so it’s like, hey, we release and we spike, and then we spend, on top of the innovation time, time to get those spikes back down. Or we’re leaving them out there, and the customers begin to deal with death by a thousand cuts, because, oh, it wasn’t a big issue, but there are a thousand of them everywhere, like gnats, right? Like walking into 5,000 gnats.
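The release-spike correlation Erika describes can be sketched as a simple check: bucket defect reports by day and flag releases whose nearby defect count exceeds the baseline daily rate. This is only an illustration; the window size, threshold factor, and data shapes here are assumptions, not anything prescribed in the episode:

```python
from collections import Counter
from datetime import timedelta

def release_spikes(defect_dates, release_dates, window_days=3, factor=2.0):
    """Flag releases whose defect count in the following `window_days`
    exceeds `factor` times the average daily defect rate (a crude
    spike heuristic over per-day buckets)."""
    daily = Counter(defect_dates)                       # defects per day
    span = (max(defect_dates) - min(defect_dates)).days + 1
    baseline = len(defect_dates) / span                 # average defects/day
    spikes = []
    for rel in release_dates:
        nearby = sum(daily[rel + timedelta(days=d)] for d in range(window_days))
        if nearby > factor * baseline * window_days:
            spikes.append(rel)
    return spikes
```

If most releases land in the spike list, that is the pattern she warns about: every release introduces a spike of issues that the team then has to work back down.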

Matt: Yeah. I went to school down in south Georgia, at Georgia Southern, so I’m used to the gnats. You can probably sympathize, being in Atlanta.

Erika: Can I get around you? 

Matt: Yes. That’s a good point though. You talk about, how do you help the business recognize the value and the impact of process and quality? And it’s almost like, before it’s too late is the key thing. How do you help them recognize that value? Where have you found success, or what are some good things to hone in on to help connect it to business value before it is too late, and you’ve got all the C-suite breathing down your neck, like the scenario you mentioned that nobody wants?

Erika: It’s identifying the story of quality within your organization. When I come into an organization, I really want to hear... I hold what I call my “what the bug?” meetings, and I’m meeting with different people. I’m asking them similar questions depending on where they are level-wise. Some are a little more detailed questions, some are more strategic, but they’re still in the same vein. And then I’m looking for the categories. I’m looking for the sentiment in the conversation. I’m looking for the themes to surface, to help understand where the problems are. Because the thing about telling a story is you want it to be compelling. You want it to be interesting. We’ve all picked up a book before and gotten maybe a chapter or two in, or watched a new series and gotten halfway through the second episode, and been like, this just isn’t my jam. The story of quality is no different. You have to tell a compelling story. You have to explain it in a way that attaches and connects to the business’s heart, to what leadership is interested in, to the value of the business, which is the customer, which is our revenue. You’ve gotta connect it into that conversation, and that takes time. That requires a lot of moving parts and pieces. But when you understand the sentiments, when you get that feedback, you’re at least able to say, ooh, you’re worried about availability, or, ooh, you’re worried about SEO tracking, or you wanna understand our customer sentiment. Okay, how can I get that and surface that information? How can I make that visible through the lens of quality and say, hey, listen, we’re tracking this, and we wanna hold the teams accountable to it and start to drive that conversation?
So you’re taking the heart of what the business is interested in, and you’re moving it through the lens of quality and pushing it back to the teams to say, this is something that we need to look at. How are you helping to improve this?

Matt: You’re like a quality marketer. I promise that’s one of the reasons you’ve been so successful in your career: being able to connect that story. That’s so cool. I love that.

Erika: It is not an easy thing to do, I tell you. But it’s interesting. The frustrating thing, though, is that it’s not a common expectation, so I have to explain why I’m going through all of that. So yeah, I’m wandering around sometimes asking, what data do you have? What is the data? And people are like, but why? Getting this data is hard. And I’m like, I know, and I don’t really have a why for you yet. I’m actually just trying to see what you are tracking. What do you think quality is, and what do you measure? Because now I wanna pull it together into a single, cohesive conversation and be like, now, when we look at this across the board: hey, we have a problem right here. Should we focus in on that?

Matt: Yeah. That’s how you connect the dots, that’s right. And one thing you mentioned earlier, you talk about acceptance criteria. I’m curious about your perspective on this: when should quality members on the team, whether it’s a QA engineer or whatever role it may be, be engaged in understanding the user stories, requirements, or whatever it may be?

Erika: At the very beginning, with everybody else. Here’s the thing: quality team members have the benefit of constantly exercising the entire system. If somebody knows the ins and outs of your house and you have a problem, or you want to make an addition to your house, wouldn’t you call them first? Somebody who’s constantly... I’m thinking, we just had a problem with our AC, and we called somebody, and it was the same guy that came out and fixed the AC problem we had last time. I’m not gonna talk about the shadiness that feels like, but he was talking to my husband, cuz I wasn’t out there, and he was like, yeah, this is what we talked about last time, here’s this, that, and so forth and so on. He knew the problem, and just that knowledge coming in made it so much quicker to get to the resolution, and therefore it cost us less money, because he was out here less time, right? That’s QA. QA is constantly exercising your system from the customer perspective, which is who we care about. QA is connected to the heart of the customer inside the business.

Matt: Yeah, it’s the health of it. And I want everybody to pause for a second just so you don’t miss this point. If you’re a Scrum Master, product person, or whatever it is, bring your QA folks into these ceremonies early on, because, to your point, they can save you. A lot of times they’re gonna be thinking about something from a different angle that you may not be thinking about, one, and they’re gonna be given additional context for when they actually are doing the testing, which is gonna make their job a lot easier. So anybody that’s not doing that, bring your QA friends into those conversations earlier on.

Erika: Yes. And I will point out one thing that I often have had to do when I’m going into new orgs, when I have a new team: I have to coach inside of my team, because the QA folks can be wallflowers at times. Some of them can be wallflowers, and so they will come into a conversation and they’re like, yep, I’m listening, I’m actively listening. And that sounds odd, but okay: they know what they’re talking about, so I’m just gonna wait for it to come to my desk, and I have the context. But the QA organization, and it’s one of the things I love talking to the QA community about, we are more than just testing in that single step in the delivery life cycle. We provide value. We need to speak up. We need to provide the, here’s a gotcha, have you considered this, have you turned the box this way? And when team members are brought into those conversations, ask for that. Pull on them. Request that feedback, like, hey, what do you think? I love to talk about things in the form of boxes. So these are the boxes of the flow. Currently, we wanna shove one right there. What do you think about that? What’s gonna happen? How does that help or harm the journey that the customer experiences and goes through? Is that good? Is that bad? Ask those very specific questions, specifically to the QA team, to draw them out and get that insight.

Matt: And that’s a facilitator tip there, right? If you’re a Scrum Master or product person, one thing that we do in a lot of workshops is we’ll always ask around the group: hey, Lisa, do you have any clarifying questions or anything like that? Bob, do you? And you go around the full room. And it’s funny, a lot of times people say no, but... and then they’ll go into what’s on their mind. So that’s a good tactic to get those, like you mentioned, wallflowers to speak up, cause they do have an opinion, and it’s a valuable one a lot of the time. All right, so the hot topic right now. Everybody’s talking about it, everybody and their mom: generative AI. Now, we’re playing around with GitHub Copilot and some other tools at HatchWorks. But I’m curious, what is your perspective, thought, theory, whatever it may be, your prediction on how generative AI will impact the quality assurance discipline, positively or negatively? How does it evolve? What’s your hot take?

Erika: It’s significant. It’s significant. The thing to remember with all of these technologies: these are tools in our toolbox, right? I’ve heard the conversations, people saying it’s the end of QA, the end of testers, the end of all of these things. But AI’s been building in the quality space for years now, for years. ChatGPT: I love some ChatGPT, right? Just being able to ask questions. It is another way to turn the box. It’s another way to leverage a tool to help us better communicate, to help us quickly write scripts. But in general, with generative AI, the conversation around it is automating routine testing. Generative AI can create new test cases that mimic the variety of user behavior and edge cases. Let it do it, right? We humans still need to be in the conversation, because we still need to analyze that. AI can handle those routine tasks, but we are analyzing it as humans. It changes our role, and that’s the thing: it doesn’t go away. It changes our role so that it can be more cognitive. We can literally sit with something and think about it, as opposed to this is mundane, this is redundant. You know what people have said for years: testing is just banging on a keyboard, which it is not, it has never been. But this gets us even further away from that idea, because now we’re like, let the machine take the inputs and generate something, and then let us tweak it to be more informed, more intuitive, more human. Let us use the machine to do predictive analysis, analyzing historical data to predict potential problem areas. Let it enhance performance testing, or increase QA accuracy. Unbiased testing, this is a big one, so let me go further into that. So the story: when the Apple Watch came out, eventually one of the stories was that women are the biggest users of it. I don’t have the data points, it’s been so many years.
Women are the biggest users of this, but it did not have period monitoring on it, yet women are the biggest users. It was a miss, right? Because women were not included in that product team. They were not included in the usability testing. This was a big miss. And when it was added, women were like, hallelujah, thanks. But we have these biases, especially, like you talk about, in the DE&I space, and when you think about accessibility. I have biases. We all have them. I don’t know what it feels like, or what to consider directly, when it comes to screen reading, to not being able to read the screen. I don’t know what a better experience is. But that could be programmed into AI. There’s unbiased testing supported with AI, taking that information from leaders in the space who understand it, and plugging those in as models that AI can use. So there’s so much opportunity to leverage this tool to create more efficiency, to create more impact, to be more valuable in the organization. But we can’t be scared of it. We need to look at it and be like, listen, you are mine. I know that you’re a hammer and there is a nail. I am not going to use you to do these other things, but I’m gonna use you to nail everything in, cuz I know how you work.

Matt: This is great, and I love this. You have the eternal optimist mindset versus the pessimistic "it's gonna take everybody's job" mindset. And I love that, because it's an enablement view: it gets me out of the mundane stuff I don't want to be doing, and it uplevels us as humans. That's why I love how it's positioned. People talk about it as a copilot: we're still in charge, but it's helping enhance what we're doing. Really exciting stuff. I love where this is going, and I love that you're testing it and playing around with the tool versus waiting. I think that's where so many people miss: once it becomes mainstream, it's too late and you're trying to play catch-up, right?

Erika: Yeah. Transparently, listen, when it first came out I was playing around with it, and I was like, okay, here's a requirement, write a test case, or tell me what the acceptance criteria for this requirement are. And in two seconds it rattled some stuff off, and I was like, those are decent. All right, now use this and tell me what the test cases are for it. Two seconds later, it rattled off some pretty decent test cases. And I say decent test cases with it not being informed, especially before it had access to the internet. It really was not informed, just going off of "if we're talking about this thing, if you've told me this is the requirement." So give it enough information to be informed, so it doesn't just say you're gonna have to log into the system, so do that, right? Instead: if you're doing this functionality, here are some things you'd want to connect. And then really deep dive in and ask: what are some non-functional versus functional cases? What about security? What type of performance testing? How would I test these APIs? What type of data should I use? What are some considerations? And just continue the conversation. That was fun. But it was scary at first, because I was like, oh, snap.
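Her workflow, give the model the requirement plus enough context that it can't fall back on generic steps, then walk it through functional, non-functional, security, performance, API, and data angles, could be sketched as a prompt builder. The function name, `ANGLES` list, and prompt wording below are assumptions for illustration, not her actual prompts.

```python
# Hypothetical sketch of the prompting workflow described above: supply context
# so the model can't answer generically ("log into the system"), then ask for
# test cases across the specific angles she enumerates.

ANGLES = [
    "functional vs. non-functional test cases",
    "security testing",
    "performance testing",
    "API testing approach",
    "test data to use",
]

def build_test_case_prompt(requirement: str, context: str) -> str:
    """Assemble an informed prompt asking an LLM for concrete test cases."""
    angle_lines = "\n".join(f"- {a}" for a in ANGLES)
    return (
        f"Context about our system:\n{context}\n\n"
        f"Requirement:\n{requirement}\n\n"
        "Write concrete test cases for this requirement, covering:\n"
        f"{angle_lines}\n"
        "Be specific to the context above; avoid generic steps."
    )

prompt = build_test_case_prompt(
    requirement="Users can reset their password via an emailed link.",
    context="Web app with REST APIs; email delivery is a third-party service.",
)
# The assembled prompt carries the context, the requirement, and every angle.
assert "reset their password" in prompt
assert all(a in prompt for a in ANGLES)
```

The resulting string would be sent to whatever chat model is in use; the "continuing the conversation" step she mentions would be follow-up messages in the same thread.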

Matt: Yeah. And like you said, it's, wow, this is decent. But connecting back to a point you made earlier, where you had the example of somebody with blinders-on focus in their organization who didn't think about how they were impacting others: this could be a use case right here, where generative AI and the tools we're using do have that purview across the entire organization to say, hey, are you considering this? Maybe something outside of your discipline. That's an interesting use case as it starts to evolve. I think it's really exciting where this could go.

Erika: I want us to get to the point where we're able to privately feed it information and say, okay, now that you understand this ecosystem, now that you understand our structure, our business flow, our business model, what should we innovate on? What are the concerns with our product? Now that you've analyzed our tests and how they're performing, should we innovate or should we fix tech debt? And what's the impact, what's the financial impact? AI can start to answer all of those questions with just a few keystrokes. That is so exciting. And I'm not saying getting to that point isn't significant. I get that: what data do we feed it, how do we feed it that data, how do we protect privacy and data security, all of that. But man, there's Jetsons-level opportunity there.

Matt: I always think back to the beginning of cell phones, from where they were to where they are today. Nobody could have imagined where we are today, where the internet's gone. I think it's gonna be the same thing with generative AI in a lot of ways. So it's gonna be fun to watch.

Erika: That's like the iPhone, right? That's where AI is, it's at that point. All phones now follow that same view; every phone is that smartphone view, based off of what Apple did. Nobody has a Razr flip phone anymore. I remember the little Verizon brick thing that slid up, that was the cool thing. Not anymore, right? Or a Sidekick. Yep, that's right. Generative AI, that's where we are right now, and I cannot wait.

Matt: That's awesome. All right, so let's do a couple quick rapid-fire questions to wrap it up. First thing that comes to mind: what company is doing QA right? Is there somebody in the community that you're like, oh, they're really good at it?

Erika: That's not a fair question. It's subjective, right? Everybody's doing something right. Can I plead the fifth? Everybody's doing something right, and everybody has opportunity. I worked at Calendly; I'm gonna say Calendly is doing it right.

Matt: Shout out to Calendly, then.

Erika: Yeah. Teams are looking to improve. There are a lot of great things that Realtor is doing, and there's still opportunity. There was opportunity at Calendly, there was opportunity at Kabbage. It's just about the focus. So yeah, I'm gonna plead the fifth.

Matt: No, those are good answers. What about an individual? Is there anybody in the QA community that you follow or think is influential?

Erika: Angie Jones is amazing. Lisa Crispin and Janet Gregory are the agile queens; they have the Bible on agile testing and processes, three of them actually. They come top of mind for me.

Matt: For sure. We'll put their LinkedIn profiles in the show notes for folks who may be interested in following them. And what's one piece of advice you wish you could give your former self if you could go back?

Erika: It's usually not about quality.

Matt: It doesn't have to be. It could be anything.

Erika: Own what you know. Don't worry about what you don't. Own what you know, because what you know is impactful, it's important, and it's valuable. When you spend time worrying about what you don't know, you don't celebrate, champion, and communicate to others what you are excellent at, and therefore you don't continue to hone it. It's okay to know what your gaps are, if you want to work towards filling them. But some gaps... I don't want to learn how to surf, and that's okay. I'm not a surfer and I don't want to learn how to surf. I like to swim, and I want to learn how to become a better swimmer. Still in water, right? So it's, hey, what are you excellent at? Where do your passions lie? Own what you know, lean into that, and don't worry about what you don't.

Matt: I love that. And one thing you mentioned earlier, just to wrap it up: one thing I love about your experience and what you do is your involvement in women in tech and the diversity and inclusion space. I see you're involved with women in tech and career coaching there. Is there anything you're excited about within this space, or anything about how you're helping folks in this area?

Erika: Yeah. As a woman in tech myself, I've spent the better part of my career being the only woman in the room, especially as a leader, and also being the only Black person in the room. That can be difficult; it has been difficult. I've had to learn how to manage my own imposter monster. I've had to learn how to manage my voice and show up in the way that is right for me, without worrying so much about how others think I should show up. I had somebody tell me I should be more docile and quiet, because certain genders should be docile and quiet, and that I should modulate my tone. So I'm passionate about coaching women especially, because I spent a lot of my career not being confident about who I was and how I showed up, second-guessing, not speaking up when I should have spoken up, not owning what I knew. I'm excited about that, and I love to talk to women in the space about it and help them.

Matt: Yeah. And the community element is so important too, I think: having that community of folks who are going through the same thing, who can trade stories. I've got two young daughters at home, so I appreciate you pioneering the way for women in tech as they come up. You're an awesome role model there. But just to wrap it up, where can people find you, whether it be LinkedIn or what you're doing? Anything you wanna plug here at the end?

Erika: Ah, yeah. I'm Erika Chestnut on LinkedIn; please feel free to reach out. I love to talk about quality, I'm a bit of a dork about it, and obviously I love to talk about women in tech in general. But you can also reach me at erikachestnut.com. That's where you'll learn a little bit about my women-in-tech coaching and my quality leadership consulting and coaching, all things I love to do. I'm really passionate about coaching and supporting people, whether in women in tech or in the quality sphere. Feel free to reach out. I'm around.

Matt: Awesome. Thanks, Erika. Appreciate the conversation. Thanks for joining Built Right.

Erika: Thanks.
