About this podcast episode

🚀 The human still decides, while the AI provides

Aaron Upright, Co-founder and Head of Marketing at ZenHub, shares how ZenHub is adding innovative AI solutions into their product to promote development team efficiency.

Bill and Aaron share real-world examples of how AI can complement teams, how to prioritize AI projects, and the importance of customer feedback and testing.

In this podcast, you will learn the following:

✅ The transformative role of AI in enhancing agile team efficiency

✅ Strategies for seamless workflow management in software development teams

✅ Innovations in sprint planning and reviews through AI integration

✅ Upholding values and ethics in AI implementation for project management

Transcript

(transcripts are auto-generated, so please excuse any errors)

[00:00:00] Introducing Aaron Upright

Bill Raymond: Hi, and welcome to the Agile in Action podcast with Bill Raymond. Today I’m joined by Aaron Upright. Hi, Aaron. How are you today?

Aaron Upright: I’m doing well, Bill. Thanks for having me.

Bill Raymond: Yeah. I’m looking forward to talking to you today. We’re going to talk about promoting efficiency and accuracy in agile teams using AI.

[00:00:18] The Origin and Evolution of Zenhub

Bill Raymond: And you are the co-founder of Zenhub, and you are also the Head of Marketing.

Bill Raymond: Can you introduce yourself a little bit more so our audience can learn some more about you?

Aaron Upright: Yeah, like you mentioned, I am the Co-founder and Head of Marketing of Zenhub. We are an agile project management and roadmapping tool that’s built for software teams. And Zenhub got its start in a pretty unique way. We didn’t actually set out to go and build a product. We set out to solve a problem that we had for ourselves when it came to how we were managing our projects, which was that it always felt very disconnected from the code, where, as engineers, we were spending a lot of our time.

Aaron Upright: The original version of Zenhub that we actually went and built was a very simple Chrome plugin for GitHub, built to bring a lot of project management elements to that platform and really turn GitHub from a very simple issue tracking tool into a fully fledged project management tool that we could use to track the priority and progress of our issues.

Aaron Upright: Zenhub has evolved a lot over time, which I’m sure we’ll talk about in the podcast here today. But that’s a bit about me and a bit about where Zenhub got its start.

[00:01:18] The Role of AI in Agile Teams

Bill Raymond: At this point, everyone in the audience has probably learned that we’re going to be talking about software development, but I think this is going to be important for anyone that’s looking to manage an agile team. There’s all this conversation about whether or not artificial intelligence is going to replace engineering.

Bill Raymond: I think we’re going to come to the conclusion that it won’t, at least not in any kind of near term situation, right? But we are going to talk about how AI can improve the efficiency of agile teams.

[00:01:49] Understanding the Workflow of a Software Development Team

Bill Raymond: Now I guess it would probably be useful if you could just take a moment and talk about the typical workflow of a software development team at a high level.

Aaron Upright: Rather than thinking about a specific methodology or specific framework and trying to support that, what we really tend to focus on at Zenhub is: what are those common jobs to be done that every software team has to perform and every software team has to do to some level in order to deliver software to customers?

Aaron Upright: It usually involves the team coming together on a daily basis to discuss what’s being worked on, where the blockers are. Sometimes in scrum that’s called a daily standup or daily scrum, even if you’re not a scrum team, teams tend to practice that ritual on a daily basis. We often see teams come together every couple of weeks to talk about their process, what’s gone well, what’s not going well.

Aaron Upright: What actions they can take or experiments they can run to go and improve that process. Again, in a very scrum centric world, that’s called a retrospective. But even if you don’t follow scrum or that methodology to a tee, typically we see teams coming together, every couple of weeks to discuss how they’re doing and opportunities to improve.

Aaron Upright: So a lot of what we’ll be focusing on today and a lot of what we’ll be talking about is through that general lens of not how does AI and automation in particular, how does it apply to a framework? But how can we apply it to some of these really common jobs to be done that we see every software team doing?

Aaron Upright: And frankly, that we see every software team spending too much time running these events rather than focusing on the work that matters.

Bill Raymond: Yeah, that makes sense. I appreciate that. I think some of the products that you see out there on the market, they just slap AI on there. And then it just basically has OpenAI typing things for them. But I think you’re going at this in a different direction, aren’t you?

Aaron Upright: Yeah. Again, just to come back to those jobs to be done. We’re really looking at where teams are spending the majority of their time and how we can use automation and AI to make those teams more efficient. So you talked about some of the products that exist out there in the market where

Aaron Upright: a bot or a note taker would translate conversation points from a meeting into a Jira ticket or into a Zenhub ticket. Even those are great, and I think they definitely serve a place in the market, but let’s say you’re not very efficient at running a team standup and that standup goes on for 45 minutes.

Aaron Upright: It’s great to have a note taker there that’s translating all those action items into tickets. But it’s not really addressing what we see as a fundamental problem, which is your standup is taking 45 minutes. And so a lot of the AI and automation that we built into our product, and really how we think about leveraging those two technologies is really aimed at how do we help teams be more efficient.

Aaron Upright: And that’s really our ethos at Zenhub: how do we help unlock the productivity of software teams and unlock team members to spend time on work that really matters? So for developers, how do we do everything in our power to help them spend less time in planning meetings and more time actually writing code and developing software?

Aaron Upright: If you’re a scrum master on the team or a project manager, how do we help that person better prepare for a sprint planning meeting or better prepare for a daily standup so that meeting can be run more efficiently? And instead of taking an hour and a half to plan a sprint, they can plan that sprint in 30 minutes, go on with their day, and then invest that time into creative ways to help the team or unblock the team on something that they might be blocked on.

Aaron Upright: So that’s really how we think about the application of AI within our product.

Bill Raymond: Yeah, and I think that really resonates with people. We talk in the agile world generally about a stand up meeting being, what, 15 minutes, 20 minutes, something along those lines. We try to keep the meeting short so that people can go back and do their work.

Bill Raymond: And if it’s going to take 45 minutes or an hour, then we are killing this whole concept of agility. And of course, we’ve talked to people about how to improve teamwork and efficiency, how people can take coaching courses to learn how to improve those things, but at the end of the day, there’s always work that a person needs to do to prepare for one of these meetings, even if it is just 15 minutes.

[00:05:44] The Impact of AI on Daily Standups

Bill Raymond: So what are some of the things that you do to improve these daily standups with your software?

Aaron Upright: Yeah. So some of the things that we do: we make it really easy for teams to run a standup, and we do that asynchronously in our product. One of the things that’s always really bugged me about standups in particular as a ritual is that there’s this perception that a standup can only happen every morning.

Aaron Upright: It’s only in the morning, over coffee, that the team can come together and talk about what they’re blocked on or what they’re working on or what they’re struggling with. When in reality, that should be something that comes up asynchronously over the course of the day. When someone gets blocked in the afternoon, we shouldn’t have to wait until the next morning or the next day’s standup in order to talk about that. That should be visible to everyone on the team, and we should be able to action that immediately to unblock that individual.

Aaron Upright: So one of the things that we’ve done with our standup experience in our product, what we call the daily feed, is we’ve made it really easy for team members to see where they’re blocked, where they’re holding up a code review, or where they’re waiting on a code review, in real time and asynchronously. So a developer can go in any time to that daily feed and see, hey, one of my team members is actually blocked on a code review.

Aaron Upright: I can go help them and unblock them so they can keep working this afternoon if I go in and help out with that code review. We’ve built it into our product in a way that really feels asynchronous. So if you’re working across time zones, if you’re working across different geographies, it’s really helpful too, so that you’re not waiting until the team comes together the next morning in order to address or alleviate that blocker for a team member.

Aaron Upright: We think that not only cuts down on the number of meetings but also helps people be more efficient and reduces context switching as well.
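
For developers who want to picture what that kind of "am I holding up a code review?" signal could look like, here is a minimal, hypothetical sketch that pulls it straight from GitHub’s REST API. This is not how Zenhub’s daily feed is built; the repository names, the `requests`-based client, and the environment variable are assumptions for illustration.

```python
# Illustrative sketch (not Zenhub's daily feed): list open pull requests and
# flag the ones where a given teammate has been requested as a reviewer.
# Assumes the `requests` package and a GITHUB_TOKEN environment variable.

import os
import requests

def reviews_waiting_on(owner: str, repo: str, username: str) -> list[str]:
    """Titles of open pull requests where `username` is a requested reviewer."""
    response = requests.get(
        f"https://api.github.com/repos/{owner}/{repo}/pulls",
        params={"state": "open"},
        headers={"Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}"},
    )
    response.raise_for_status()
    return [
        pr["title"]
        for pr in response.json()
        if any(r["login"] == username for r in pr.get("requested_reviewers", []))
    ]

# Example with hypothetical names: which reviews is "aaron" currently blocking?
print(reviews_waiting_on("example-org", "example-repo", "aaron"))
```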

[00:07:20] The Role of AI in Sprint Planning and Reviews

Bill Raymond: We tend in agile teams to plan for a week, maybe two weeks, maybe not more than three. We try to plan that work out so that we can then step back and do something like a sprint review and check and make sure that we’re still on track, and we’re meeting our objectives, and we still understand the outcomes, and all that good stuff.

Bill Raymond: How might AI help with that process when you’re sitting back and reviewing that previous sprint?

Aaron Upright: Yeah. So there’s a couple of things that come to mind for us. And they’re both experiences that we’ve started really building towards in our product. Just taking a step back from AI and looking broadly at automation. One of the things that we realized is that sprints are actually a really good candidate for automation, right?

Aaron Upright: If you’ve done the work and invested upfront as a team into refining your backlog and estimating issues or the complexity of issues, sprint planning is largely a mathematical exercise, right? You understand your velocity as a team, right? Maybe you can get 20 or 30 story points of work done over a two-week iteration.

Aaron Upright: If you’ve done the hard work of grooming your backlog, making sure the highest priority issues are at the top, making sure all that work has been estimated appropriately by the team, then the actual act of planning a sprint is just math, right? How many issues can you actually bring into that sprint that satisfy that velocity?
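
To make that "just math" point concrete, here is a minimal sketch of turning a groomed, estimated backlog into a sprint candidate. The issue names, point values, and the simple stop-at-velocity rule are illustrative assumptions, not Zenhub’s actual selection logic.

```python
# Minimal sketch: build a "sprint candidate" from a prioritized, estimated backlog.
# Walk the backlog in priority order and stop once the team's velocity is reached.

from dataclasses import dataclass

@dataclass
class Issue:
    title: str
    story_points: int

def build_sprint_candidate(backlog: list[Issue], velocity: int) -> list[Issue]:
    """Select issues in priority order until the team's velocity (e.g. 20-30
    story points per two-week iteration) would be exceeded."""
    candidate, total = [], 0
    for issue in backlog:
        if total + issue.story_points > velocity:
            break  # the next issue would overshoot the velocity, so stop here
        candidate.append(issue)
        total += issue.story_points
    return candidate

# Example: a 20-point velocity against a hypothetical prioritized backlog
backlog = [Issue("Fix login bug", 3), Issue("Add export to CSV", 8),
           Issue("Refactor billing API", 13), Issue("Update docs", 2)]
print([i.title for i in build_sprint_candidate(backlog, velocity=20)])
```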

Aaron Upright: So one of the things we’ve actually done in our product to start to build some automation into sprint planning is we’ve given teams the opportunity to automatically build what we call a sprint candidate. It’ll take their backlog, and Zenhub will select up to the first 20 or 30 story points.

Aaron Upright: We’ll form a sprint candidate around that and we’ll hand that off to the product manager or scrum master and say, you can do the remaining 20%, right? You can pick and choose maybe which issues you want to include or don’t want to include, but 80 percent of the work has already been done for you, as opposed to having to start from scratch, create a sprint manually, and manually add every single issue that you want in there.

Aaron Upright: We’ve done a lot of that work to really try to help streamline that event for teams. On the back end of that, where you talked about sprint summaries and sprint reviews, that’s another really important event. And just like a lot of these other agile events and jobs to be done, it’s an area where we see teams spending a lot of time, right?

Aaron Upright: Trying to pull together what was actually accomplished, what was actually done. Trying to write a summary in a human readable format relative to the goal. Hey, here’s how we performed. Here’s what we didn’t do. All of that takes time and effort from a project manager or a scrum master or a product owner, depending on who’s doing that work on the team.

Aaron Upright: When we talked to our customers, we saw teams spending an hour or two on that every other week. And so that was one opportunity where we said, AI is great at summarizing information, right? You can give it a whole paragraph of text and ask it to come up with a one- or two-sentence description of that text, right?

Aaron Upright: And so what we said with our AI sprint reviews is: let’s give AI all that information that happened within a sprint, all those issues that were opened, all those issues that were closed, all those issues that were left open. And let’s ask it to generate a summary of what happened. And the output of that is, in a very human readable form, a summary of the sprint, along with a categorization of all the work that was done, grouped into core themes.

Aaron Upright: Hey, all of this work touched on our API. All this work touched on our back-end systems. There’s maybe some miscellaneous issues or some tech debt that we introduced in there as well. With the click of a button in our product, we’ve made that possible to try to reduce the amount of manual effort and manual time that teams need to spend on that.
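
As a rough illustration of the "give the AI everything that happened in the sprint and ask for a summary" idea, here is a minimal sketch. It is not Zenhub’s implementation: the prompt wording, the model name, and the issue titles are assumptions, and it assumes the OpenAI Python SDK with an API key in the environment.

```python
# Minimal sketch of an AI sprint summary: list the sprint's issues in a prompt
# and ask an LLM to summarize them and group the work into themes.

from openai import OpenAI

def summarize_sprint(sprint_name: str, closed: list[str], still_open: list[str]) -> str:
    prompt = "\n".join([
        f"Summarize the sprint '{sprint_name}' for stakeholders in plain language.",
        "Completed issues:",
        *[f"- {title}" for title in closed],
        "Issues still open:",
        *[f"- {title}" for title in still_open],
        "Group the completed work into two to four themes (for example: API, back end, tech debt).",
    ])
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(summarize_sprint(
    "Sprint 42",
    closed=["Add pagination to the /projects API", "Fix retry logic in the webhook worker"],
    still_open=["Migrate reports to the new data store"],
))
```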

Bill Raymond: Oh, so that’s great. It’s almost like, you’re going to walk into that sprint review meeting and this virtual helper has helped you sort out what the review is going to look like. And it’s also called out some of the things that are going to be very important to chat about.

Aaron Upright: Yeah. And we see that as a great starting point for the agenda for a sprint review meeting, right? I still think there’s a huge value in getting the team together, reviewing work, celebrating wins. Sprint reviews are an exciting time. Everyone gets to come together and showcase what they’ve been working on.

Aaron Upright: And understandably, people are proud of the work that they’ve done. They want to demonstrate it to their colleagues or to their peers. We don’t want to take anything away from that and completely automate that event. But if we can start with a really strong agenda and then have a really strong deliverable that a product manager can use afterwards and say "hey, here’s the recording of our sprint review."

Aaron Upright: If you want to go watch it, here’s a summary of what we got done in this sprint and deliver that to stakeholders or leaders elsewhere in the organization. It’s a really great outcome. And again it’s helping save them time that they otherwise would have had to do to manually collect those updates, pull them together, write a summary around that, group that work into themes and then deliver that to a stakeholder or a leader in the organization.

Bill Raymond: You started off this conversation by taking a bit of a higher level view of how you plan the work. And a big part of that is estimating the work, so new work comes in, whether they are bugs to be fixed or new product features or small enhancements, all that stuff.

Bill Raymond: It all needs to be broken down and estimated.

[00:12:19] The Potential of AI in Estimating Work

Bill Raymond: Can you talk a little bit about what that process looks like and then how you’re seeing AI support that?

Aaron Upright: Candidly, estimation is not an experience that we’ve built into our product using AI today. It’s a very manual process, but it’s one we’re really fascinated by bringing AI to, because we think there’s a lot of potential.

Aaron Upright: When it comes to estimating software in particular, you’re using story points or doing comparative estimates, as you might say, of how complex this piece of work is relative to another piece of work that we’ve worked on. That’s what we see teams doing a lot, whether it’s story points or t-shirt sizes.

Aaron Upright: How complex is this particular task relative to other historical tasks that we’ve done or maybe accomplished at some prior time? And that’s where I think we see a big opportunity for AI in our product to help support that, which is leveraging AI to go find those similar issues, right? And then come up with a comparative estimate based on how complex those previous issues were.

Aaron Upright: That’s something we’re really interested in investing in. It’s something that is in the works, not something that we have live in our product, but we think it can be really powerful, where maybe you have a Zenhub agent that’s sitting in on a planning poker meeting. For those of you that aren’t familiar with planning poker, it’s a ritual where the team comes together and everyone provides their estimate on how complex a piece of work might be.

Aaron Upright: We’ve thought about, what if Zenhub could sit in on that meeting and be a participant in that planning poker? Offer up its own estimate based on historical work that we’ve seen the team accomplish and how complex that work is. It’s something we’ve been thinking about a lot and an area that personally really fascinates me, because estimation is one of the most contentious things in software development.

Aaron Upright: Oftentimes there’s a lot of disagreements and a lot of people that don’t always see eye to eye on things. So if we can come in and provide that more objective lens and use AI to kind of power that estimate, again, we think that can really help facilitate that event that teams do.
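
Since this estimation agent is described as an idea in the works rather than a shipped feature, here is a purely hypothetical sketch of the comparative approach: find historical issues that resemble the new one and suggest an estimate from their story points, returning the similar issues so the suggestion stays explainable. The word-overlap similarity is a stand-in for whatever real similarity or embedding model a product would actually use; the issue titles and point values are made up.

```python
# Hypothetical sketch of comparative estimation: suggest a story-point value
# from the k most similar historical issues, plus the evidence used.

from statistics import median

def similarity(a: str, b: str) -> float:
    """Crude word-overlap (Jaccard) similarity between two issue descriptions."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

def suggest_estimate(new_issue: str, history: list[tuple[str, int]], k: int = 3):
    """Return a suggested story-point value and the similar issues it was based on."""
    ranked = sorted(history, key=lambda h: similarity(new_issue, h[0]), reverse=True)
    neighbors = ranked[:k]
    return median(points for _, points in neighbors), neighbors

history = [
    ("Add a custom button to the settings page with localization", 13),
    ("Fix typo in onboarding email", 1),
    ("Add export button to the reports page", 8),
]
estimate, evidence = suggest_estimate("Add a new button to the billing page", history)
print(estimate, [title for title, _ in evidence])
```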

Bill Raymond: I was working on this project implementing some third party software, but we were tailoring it to the customer’s needs and the customer said, you know what, this tailoring, it’s not working. And to me, tailoring is you’re modifying some settings that you may not normally modify in the tool but they’re there and you can make it work.

Bill Raymond: But then there’s customization where you, let’s say, add some button that the product never had before. And you’re literally putting that in there or what have you. And I remember with this particular product, it was based on a framework based on another framework and to get that button into the tool, we thought it would just be, maybe a week or so of the developer’s time.

Bill Raymond: And then that button would run some code in the background, but it turned out we had all these layers of security. And then of course there was internationalization, localization. And so this button took four and a half weeks. And I remember I moved on in that project, and maybe a year later, we were still working on some fairly big elements of that.

Bill Raymond: And guess what, the question of a button came up again, and putting that in there. We had turnover in the development team, and no one knew about that. So they were all estimating a week again. So I love this idea of the agent coming back and saying, "Hey, wait a minute, there’s proof back here that this is actually not a '1', a simple task; it’s actually something that’s going to take multiple sprints to complete."

Aaron Upright: Exactly. A human being is not going to have that recollection. They may not remember, you might’ve moved on from the company, you might’ve moved teams, and now it’s a whole new team of developers that are coming together to, I don’t know, rebuild that experience or partially rebuild that experience.

Aaron Upright: This happens a ton in large enterprises that we work with as well, where oftentimes one team is building a component that another team has already built, and it could just be inner-sourced or re-leveraged, or maybe they’re building it, but in a slightly different environment or in a slightly different way.

Aaron Upright: I think there’s a lot of skepticism when it comes to software development and project management in general. I think if we were just to say, "Hey, Zenhub thinks that this issue is a story point value of 3, or we think it’s a medium t-shirt size," and we didn’t give any explanation of how we came up with that, I think people rightfully would say, "I’m not sure I can trust that," or "How did it come up with that estimate?" But if we can say we came up with a story point estimate of 3, we thought this was a medium based on these three or four other issues that we looked at.

Aaron Upright: That not only helps provide more trust and explainability, but it also gives the humans that are then going to go and do that work and come up with their own estimates, maybe some reference points that they might not have found.

[00:16:39] The Importance of Values in AI Implementation

Bill Raymond: Yeah, I think that actually leads into something that I’ve been thinking about a lot lately, and you and I did touch on this the last time we chatted, it’s around values and how you focus AI in your product. It’s interesting, you sit down and you use ChatGPT, for example, and you just ask it a question and it seems to magically know the answer.

Bill Raymond: Sometimes it doesn’t really know the answer and it’s spitting out information that may not be accurate at all, but very often you get some good conversation going and you can chat with it like it’s another human being. But there are also times when people are using it.

Bill Raymond: For example, we’ve all heard the stories about how lawyers have cited cases and brought that to the court, and it turns out they’re not real cases, and things like that. And project management is a big industry, software development is a big industry. So there’s so much data that we can pull from, and we might think that the AI is super accurate, or we might think AI can do certain things for us that maybe it really can’t.

Bill Raymond: So I’m just going on about this for a little bit, because I wanted to hear your thoughts on what are some values that you use when you’re using AI in the product?

Aaron Upright: Yeah, explainability is one that is a big one, right? We already talked about that.

Aaron Upright: Another big thing that we’ve really made core to our principles of how we’ve developed our product and how we’ve been developing with AI automation is that human beings should always have the final say, right? When we think about the application of automation and AI into our product, we think, how do we accelerate and make faster those jobs to be done or make them easier for people, not how do we completely take them off their plate?

Aaron Upright: So how do we run a better daily standup versus how do we get rid of daily standups? How do we help teams run better sprint planning meetings versus how do we get rid of the sprint planning meeting? We’re not about getting rid of these events or minimizing them or making them less impactful.

Aaron Upright: We’re about helping teams get through them faster and more efficiently, at the same level of quality as if they had spent all that time preparing or doing it themselves. That principle or concept of, look, we always want to give humans the final say, is something that’s really core to our product.

Aaron Upright: I talked about automated sprint planning and some of the work that we’ve done there. Again, the purpose of that is not to completely automate the sprint end to end; it’s to provide a sprint candidate that we think is 80% of the way there. And then as a project manager or scrum master or product manager, you can finish off that other 20%.

Aaron Upright: But hopefully in doing that, we’ve saved you 15 to 30 minutes that you otherwise would have spent on that.

Aaron Upright: I think it’s the same thing for a lot of these other AI experiences in our product, like a sprint summary, right? Always give people the opportunity to edit that. Once you generate that summary, it’s not like it’s locked in a PDF and the only option you have is to export it or email it to, you know, your stakeholder. You can come in and edit that description, and there are options to make that output shorter or longer. We even provide a button that says make it more technical or less technical, if you want to include more or less technical language in that.

Aaron Upright: So that concept of, we can do a lot of the work, but humans should always have some involvement in that process, is something that’s really important to us. And I think agile project management isn’t just about speed and velocity; it’s about alignment, getting people together and making sure that people are aligned on the goals and what they’re actually going to be developing.

Aaron Upright: So that’s a big reason why we don’t want to completely remove humans from the equation. We don’t want people to be asleep at the wheel on autopilot or copilot, whatever you want to call it. Just letting these AI systems do the work. We want people actively involved in thinking about what they’re building and how they’re going to build it and why it’s a problem and what value it’s going to provide to customers, not just completely outsourcing that work to AI.

Bill Raymond: Yeah, I find that it’s also useful when you’re trying to communicate to different types of people. For example I was working with a software development team recently, and they talk a lot to their customers, and they’ve defined what some of the outcomes were, and they look very positive, and they look like they can actually help improve customer growth.

Bill Raymond: But, then you go and you share this information with the CFO or you share it with the marketing team. Those terminologies may not mean as much to them.

Bill Raymond: I watched one of the developers take some of their agreed-upon, prioritized, and money-making features and re-explain them for their different audiences, and then they can put that into a deck, tweak the content, and then share that with everyone. And that to me is a great approach that you’re taking, which is: give me that little head start, give me that 80%, so that I can make that 20% really good and still communicate with the other human beings on my team.

Aaron Upright: Exactly. Yeah. And it really comes back again to that core principle of, how do we make teams more productive? How do we help teams save time? How do we help individuals on that team save time and reduce the amount of overhead that typically comes with running an agile process, so that they can focus on the work that really matters?

[00:21:38] The Mindset for Building AI into Your Product

Bill Raymond: I think we’ll have a number of software developers listening to this podcast. You just gave us some really good insight, thinking about the experiences, making sure that you’re never formally just making a decision on someone’s behalf, giving that human the ability to move things forward.

Bill Raymond: What is the mindset that you should take when you’re building AI into your product?

Aaron Upright: I think it comes back to, with any new technology that you’re adopting, really thinking about the problem that you’re trying to solve, right? We can embed ChatGPT into anything these days. But if it’s not solving a core problem or addressing a pain point that someone has, what value is it really providing within the product, other than it’s a cool new technology we’re taking advantage of and maybe we look a little bit more innovative than the next solution or tool that’s out there? Before we went on this journey with AI, we really thought about, what are the problems that we’re trying to solve for here?

Aaron Upright: What’s the feedback that we’re hearing from our customers where we think AI can help address that accuracy issue or that categorization issue that might be perennially coming up from teams, right? So those are the things we really thought about before we just went down this path of incorporating AI into our product.

Aaron Upright: We also thought about how can we take some of this kind of grunt work off of teams, not only because it reduces friction for them, but maybe it makes other parts of our product more powerful.

Aaron Upright: For example, one of the first experiences that we built was AI-suggested labels, right? You type in the description and the title of an issue, and we’ll generate some labels based on your existing set of labels that you use. We’ll maybe suggest that it’s a bug if you’re using language that indicates that there’s a problem that a user’s encountering. That’s an easy example, right?

Aaron Upright: But that categorization of work that otherwise would be manual and extra overhead for a team to do, not only helps reduce clicks and improve efficiency, but it makes a lot of the other experiences in our product way more valuable.

Aaron Upright: So when you’re looking at a roadmap in our product or a burndown chart in our product or a velocity report, you can then start to group that work by label and say, "we spent a lot of time on bugs this sprint. Did we actually need to do that?" Or "did that kind of work creep its way into the sprint and actually expand the scope?" So it kind of served that dual purpose for us, right? It gave the user something of value, but it also meant that something further downstream in our product, like a roadmap or report was valuable.
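
As a rough illustration of the label-suggestion idea (not Zenhub’s implementation), here is a sketch that asks an LLM to pick only from a team’s existing label set and then filters the reply so nothing outside that set slips through, keeping the result a suggestion the human can still override. The model name, prompt wording, and issue text are assumptions, and it relies on the OpenAI Python SDK.

```python
# Minimal sketch of AI-suggested labels: classify an issue's title and
# description into the team's existing labels only.

from openai import OpenAI

def suggest_labels(title: str, description: str, existing_labels: list[str]) -> list[str]:
    prompt = (
        "Pick the most fitting labels for this issue from the allowed list only.\n"
        f"Allowed labels: {', '.join(existing_labels)}\n"
        f"Title: {title}\nDescription: {description}\n"
        "Reply with a comma-separated list of labels and nothing else."
    )
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user", "content": prompt}],
    )
    suggested = [label.strip() for label in response.choices[0].message.content.split(",")]
    # Keep only labels the team actually uses, so the output stays a suggestion
    return [label for label in suggested if label in existing_labels]

print(suggest_labels(
    "Users see a 500 error when exporting reports",
    "Clicking 'Export CSV' intermittently fails for large workspaces.",
    existing_labels=["bug", "feature", "tech debt", "API", "front end"],
))
```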

Aaron Upright: So when we were thinking again about building with AI, that’s why we were really thinking about what problems can this solve, right? And how would this actually benefit the user versus how could we implement this in a cool way and just get something out the door super quickly so we can slap an AI label on our product?

Bill Raymond: Yeah. And I think that’s actually a really interesting approach, because the other thing that maybe you didn’t mention here is that it gave you some insights as to what it might be like to build some more AI into your product without playing such a high-stakes game, such as trying to automate all of the planning poker.

Aaron Upright: Exactly. Yeah, we’ve been kind of biting it off in terms of, you know, little bits and pieces here and there. How can we improve one experience that teams use day to day versus how do we maybe do something more substantial, but on an experience that’s only ever used every other week in our product?

Aaron Upright: And so we’ve been trying to bite off kind of small chunks and experiment with bringing AI into our product that way. And to your point, it’s actually led to some really interesting learnings too, from an adoption standpoint, because it seems like AI has been around for a long time.

Aaron Upright: We’ve certainly been talking about it for a long time. But it’s only started making its way into products fairly recently. And we’re seeing companies that are really excited about AI and bringing AI tools into their workflow and to their team members. But there’s still this level of cautious optimism that we see out in the market from companies that haven’t quite figured out their AI policy or whether or not they can adopt tools that have AI components or AI built into them. And so in starting with small chunks, it’s also been a really good learning experience, because it’s allowed us to see how people are reacting to these experiences. How many people are going to opt in on day one, versus how much more marketing do we have to do around the benefits of this, and not only the benefits, but the security and compliance things that we’ve gone through to ensure that we’re building this with safety in mind, right? Or with ethics in mind, right?

Aaron Upright: So it’s given us a real perspective on that too, of, okay, we launched this feature. It was a small one. People are interested in it, but they have a lot of questions on this. Okay, we need to develop more marketing material, or just more material in general, on addressing this to make people feel maybe a little bit more calm or a little bit more secure before they just jump in.

Bill Raymond: I’ve found that very often there’s feature bloat that occurs in products where you’ve put so many capabilities into it that it’s almost unmanageable from a software developer’s perspective, but think about the poor customer who has to go four levels deep into menus and screens in order to get to something.

Bill Raymond: To me, this is an opportunity for AI to handle some of that: "Oh, I think you’re trying to accomplish this. Let me see if I can help you with it."

Aaron Upright: Yeah, a hundred percent. And our product team probably won’t appreciate me saying this, but I think, as a product that’s been in the market for nine years, we already have a lot of different features and functionality. Some might call that bloat. We wanted to be very careful before we just added new experiences onto that for the sake of leveraging a cool technology.

Aaron Upright: That’s why we really came back to how do we embed this into things that people are doing every day in our product, to make those experiences fundamentally better and easier for teams.

Bill Raymond: What are some of the things that you learned at Zenhub that a software developer can take away to start improving their processes with AI?

Aaron Upright: That’s a really good question. I think a lot of it, again, comes back to that concept I talked about of, what goal are you trying to accomplish? What is the problem you’re trying to solve? And getting really clear on that. And again, not just with AI, but with any new technology that you would be embedding into a product.

Aaron Upright: Okay, how are you doing this in a way that actually provides value to the user, right? If you can’t answer that question or come up with a captivating answer to that question, then chances are what you’re going to be building is not going to be that helpful or that valuable.

Aaron Upright: The other thing that we did with all of our AI experiences is that we launched them in beta, and they’re still in beta, right? Although they’re feature complete, they’re not a hundred percent accurate; we’re still refining them, we’re still making them better.

Aaron Upright: And so we saw a real opportunity as we were building with AI to not just have it be this thing we were doing in the background and then every once in a while we’d come up with this cool feature. We really wanted to actively incorporate people into the process and our customers into the process as well.

Aaron Upright: That was something that was really important to us. And so before we actually launched any of these features to our customers, we built up quite a significant list of people that wanted to use them in beta and that were committed to providing us with feedback, saying, "Guys, this actually isn’t that useful. It’s interesting, but it’s not that useful."

Aaron Upright: It doesn’t solve a fundamental core problem that I have. And we’ve actually thrown some of those experiences out, right? Maybe we actually weren’t solving a real problem with that. It was a cool application of AI, but it’s not actually driving any value for the user, or it’s a cool application of automation, but it takes away from a lot of these events. One of the things that we were experimenting with was being able to embed a video recording into the sprint summary as well, so that you could have this kind of nice roundup and we could replace that event entirely and say, look, here’s your video.

Aaron Upright: Everyone would come on camera for three minutes and say what they’re working on, and then we’d collect that together into a video, edit it, and group it all together there. What we found people saying is, look, we really don’t need automation and AI in that event. It’s something we like to do.

Aaron Upright: We like to come together every couple of weeks and talk about what we’ve got done and celebrate those wins as a team. And if people are blocked or if they’re challenged, or if we’ve taken a different direction on things, we like to know about those as well. So that was a particular case where people weren’t so excited about automation and building with AI in that feature set. I guess to bring it back to where I started: really have that problem in mind. And for anyone building with AI, if you have customers, utilize them, bring them in, bring them close to what you’re building. Don’t try to perfect it behind the scenes and then roll it out and hope that people will be interested enough to pick it up.

Aaron Upright: Really come to them with unpolished early experiences and make sure that what you’re building is answering a question they have or satisfying a challenge or problem that they have. That was something that was really big in terms of how we built with AI.

[00:29:49] Conclusion and Contact Information

Bill Raymond: Aaron Upright, is there some way that people can reach out to you if they’d like to talk to you about this further?

Aaron Upright: Yeah. I’d love to connect with people on LinkedIn. So if you want to connect with me there, I’m Aaron Upright on LinkedIn. Send me a request, and I’d love to talk about a lot of these same topics and what we’re seeing with our customers and how they’re dealing with agile transformations and digital transformations. So I’m happy to chat and connect with anyone there.

Bill Raymond: Great. And I will share that link, and I’ll also share the article that caught our attention and had us reach out to you, which was "Will AI End Agile As We Know It?"

Bill Raymond: So all that information will be on the https://agileinaction.com website. It’ll also be in the show notes of your podcast app or in the description if you’re watching this video on YouTube. Aaron Upright, thank you so much for your time today!

Aaron Upright: Yeah. Thanks so much for having me. Really appreciate it!

Speaker: Thank you for listening to the Agile in Action Podcast with Bill Raymond. Subscribe now to stay current on the latest trends in team, organization, and agile techniques. Please take a moment to rate and comment to help us grow our community. This podcast is produced in affiliation with Cambermast LLC, and our executive producer is Reama Dagasan.

Speaker: If there is a topic you would like Bill to cover, contact him directly at Bill.Raymond@agileinaction.com.