Generative AI: Augmenting or Replacing Research?

Generative AI is making an impact on every aspect of digital product building. But we wanted to delve deeper into how it’s affecting user research and interviews, so we invited Nisha Iyer, CTO of CoNote, onto the Built Right podcast to share industry insights and predictions. 

Nisha shares the story of CoNote, an AI-empowered platform helping transcribe and organize user research. We hear her thoughts on GenAI skepticism and how CoNote builds on customer feedback to improve its efficiency. Plus, Nisha tells us her predictions for GenAI in user research and whether it could eventually manage user interviews entirely. 

Read on for the take-home moments or tune into the podcast episode below. 

How GenAI can help user research today 

In Nisha’s previous work in data science, the slow process of conducting user interviews, transcribing them, analyzing them, and acting on the relevant insights had become tedious.

After Google-searching for an AI solution and creating a few shortcuts herself, Nisha realized no one was providing quite what she needed. There was a market for an end-to-end generative AI tool that streamlined these processes. That’s when CoNote was born.

CoNote allows you to: 

  • Upload hours of user research interviews 
  • Transcribe and synthesize them 
  • See the key themes and keywords
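To make that workflow concrete, here is a minimal sketch of what an upload–transcribe–synthesize pipeline can look like, using the open-source Whisper and scikit-learn libraries. This is an illustration of the general technique, not CoNote’s actual implementation; the file names, model choice, and parameters are placeholder assumptions.

```python
import whisper
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer


def transcribe_interviews(audio_paths):
    """Turn a batch of interview recordings into plain-text transcripts."""
    model = whisper.load_model("base")  # small general-purpose speech-to-text model
    return [model.transcribe(path)["text"] for path in audio_paths]


def extract_themes(transcripts, n_themes=4, keywords_per_theme=5):
    """Cluster the transcripts and surface the top keywords for each cluster."""
    vectorizer = TfidfVectorizer(stop_words="english")
    matrix = vectorizer.fit_transform(transcripts)
    km = KMeans(n_clusters=n_themes, n_init=10, random_state=0).fit(matrix)
    terms = vectorizer.get_feature_names_out()
    themes = []
    for center in km.cluster_centers_:
        top_idx = center.argsort()[::-1][:keywords_per_theme]
        themes.append([terms[i] for i in top_idx])
    return themes


if __name__ == "__main__":
    # Placeholder file names; swap in your own recordings.
    transcripts = transcribe_interviews(["interview_01.mp3", "interview_02.mp3"])
    for i, keywords in enumerate(extract_themes(transcripts, n_themes=2), start=1):
        print(f"Theme {i}: {', '.join(keywords)}")
```

A production tool layers richer models on top of this kind of pipeline (speaker diarization, LLM-based summarization, sentiment), but even this simple sketch shows why batching transcription and theme extraction can compress a week of manual affinity mapping into minutes.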

Building a moat in the age of AI hype 

Right now, it seems like every day brings a new generative AI tool. With democratization in full flow and more people able to access large language models, how does CoNote build a moat and stand out from its competitors? 

Nisha says user experience has always been their watchword, while it often falls by the wayside for competitors. Development teams may integrate APIs and build their tech, but if you’re building a SaaS product without an intuitive front end, interest could dry up. 

CoNote’s moat is that they’re not simply consuming APIs. They have other pieces of infrastructure to keep them one step ahead of their competitors. 

Another thing at the core of what they do is a deep understanding of their users. Nisha believes CoNote provides a “simplistic flow” for the user to reach the solution to their pain point. 

How customers shape CoNote’s roadmap 

When building a brand-new tool, product development teams tend to devise a roadmap. But how much of that roadmap is pre-determined and how much is changed along the way, based on customer feedback? 

Nisha says CoNote’s ever-evolving roadmap is made up of around 70% user feedback and 30% CoNote’s own decisions. 

This is evident in the launch of their new feature, Action Items, which stemmed from repeated customer feedback and highlights the next steps users can take after using the product. 

When running the first round of CoNote interviews at the prototype stage, many of the themes and action items that emerged led to relevant features being built into the product, such as audio recordings and Zoom integration.  

Nisha says that using their own product as part of their work gives them even better insight into the changes and features that need to be added. 

Overcoming AI skepticism 

A recent User Interviews survey found that 44.3% of UX researchers are tentative about using AI in their work, compared with 5.5% of CX professionals, 9% of data scientists and 21% of product managers. 

But, in 2023, generative AI is almost inescapable. So how can product development teams fight their fears and use AI in ways that augment their processes – without taking them over? 

Nisha says that, rather than fearing its potential, it’s important to see generative AI as a way to take on the tedious parts of your work and do in a matter of minutes what would otherwise take a week. 

CoNote is a prime example of this. It takes you 85% of the way through the user interview process, leaving you with the materials you need to pull the most useful insights. 

Predictions for GenAI and user research 

Nisha believes there’s still a way to go before AI is taking on interviews all by itself. She sees a future where AI can replicate human-led experiences but says real, personal interaction is still the most effective way to perform user interviews. 

CoNote has no plans to create AI-led interview experiences, instead focusing on augmenting the cycle and making development teams’ lives easier. 

To find out more about CoNote’s story and how generative AI is changing the face of user research, listen to episode 16 of the Built Right podcast. 

Get ahead in software development with HatchWorks’ AI-driven strategies – learn about Generative-Driven Development™ now.

Matt (00:01.944)

Welcome, Built Right listeners. Today we’re chatting with Nisha Iyer, CTO of CoNote. CoNote makes qualitative research fast and easy with its AI-empowered platform, helping transcribe, analyze, and organize user interviews. It’s a tool built for user researchers, which at HatchWorks, that’s a big part of what we do, so we can definitely sympathize with that type of tool. But welcome to the show, Nisha.

 

Nisha (00:26.498)

Thanks, great to be here.

 

Matt (00:29.1)

Yeah, excited to have you on. And today we’re going to get into how generative AI, and more broadly, just the democratization of AI will fundamentally change user research and more broadly user experience. Uh, but let’s, let’s start off there. Like Nisha, why, why user research? Why this problem? What part of user research is broken or needs help? And how, how’s CoNote looking to solve it? What gap do you see in the market right now?

 

Nisha (00:58.31)

Um, yeah, great question. So just real quick intro. I, uh, my background is data science. I’ve been in the industry for about a little over 10 years. Um, and my last company, I was working at a startup. I’ve been there for five years and was, uh, had built a tech team and, um, had come to a point where we were doing product development.

 

So with product development comes user research, right? Like to build a good product, you need to understand your users. That’s how you get to product market fit. That is how you really build what people are asking for versus what you’re building in your own head. So we did a lot of user research there. And I worked directly with, you know, like a small group that did the product development. One person was a UX designer and then engineer and a data scientist and myself.

 

Matt (01:28.745)

That’s right.

 

Nisha (01:49.978)

Um, and we did a bunch of user interviews and went through the process of distilling them and really pulling out insights. And it was tedious. It took a long time. It, um, it took a lot more time than I had expected, you know, just from my technical background. And, um, I was pretty overwhelmed with like the amount of information that we had to consume. Like, you know, you do the interviews first, record the interviews.

 

Matt (02:00.605)

Mm-hmm.

 

Matt (02:13.874)

Yeah.

 

Nisha (02:16.878)

transcribe them, and by the time you sit down to really distill what has been said, like the important themes, the important takeaways, you have to pretty much go through the interviews again and go through every transcription. You know, like the basic affinity mapping technique where you’re taking post-its and grouping themes, and it just takes a long time. Like it took, you know, a week to two weeks, because you don’t have like that set-aside time to just dedicate to the distilling of research.

 

Matt (02:32.49)

Mm-hmm.

 

Nisha (02:46.878)

And so what I found myself doing with my little team was just taking shortcuts, being like, okay, I remember this, I remember that, and being like, and then internally thinking this isn’t the right way to do this. I’m 100% biasing my findings by thinking, hearing the things that I really wanted to hear, obviously, that’s just human nature. So what actually happened is that I had a…

 

Matt (03:06.374)

Yeah.

 

Nisha (03:15.726)

project come up where there was like some kind of commitment to do 20 interviews in a period of two weeks and then distill the research. And I was like, this is insane. Like from my experience with research, I was like, this is a crazy requirement. And I, and I thought like there must be some tool, like there must be some AI platform that does this. Like we, you know, we’re at the age where this should be available. So I started Googling for it and I couldn’t find anything. I was like, this is insane.

 

Matt (03:25.062)

Oh wow.

 

Nisha (03:46.042)

So I called my friend at the time, my coworker, and now my co-founder, one of my two co-founders, and I was like, hey dude, we should build this. We can do it together. Called my third co-founder and we all talked about it and all agreed that it was a huge pain point of not being able to synthesize research in a speedy amount of time. And then also just that unbiased synthesis.

 

So that’s how this came about, honestly. It’s just from personal pain points, which I think is a great way to build a product because you’ve actually experienced it and you’re like, I wanna use this to solve my problems.

 

Matt (04:25.712)

Yeah, that’s a great explanation. And you’re bringing me back to my product days where we would do user research interviews and I would always schedule like an hour after the user interview to like debrief, go through it again. And it’s like, you know, that’s a two hour block there. And then to your point, you got to synthesize the whole thing. You forget stuff, you mentioned bias, but there’s also recency bias where I’m gonna remember the most recent interview more so than the other one. And then you have like for us, we would have these Miro boards.

 

Nisha (04:42.347)

Yeah.

 

Nisha (04:51.446)

Exactly.

 

Nisha (04:55.979)

Yeah.

 

Matt (04:56.132)

were just huge with all these insights and it’s like you’re trying to connect the dots. It’s it can get messy so like I can I can feel that pain. It’s it’s uh bringing back some memories from those days.

 

Nisha (05:00.128)

Yeah.

 

Nisha (05:08.254)

Yeah, exactly. 100%. It’s just like, how do we, and then, and so like, just to continue on, it was just like, all like, this journey has been quite serendipitous. Honestly, I ran into my upstairs neighbor, and she now also works for CoNote and is with us, and she was a user researcher, and I told her the idea and she was like, oh my god, like this is gonna make my job so much easier. Right. And I, and like, I like

 

Matt (05:19.102)

Mm-hmm.

 

Matt (05:29.626)

Oh, perfect.

 

Matt (05:35.004)

Yeah.

 

Nisha (05:37.31)

I’ll stop there, but I just want to like touch on that as well, because it’s not like, oh my god, it’s gonna take over my job. It was more like, this is gonna make my job so much easier.

 

Matt (05:46.256)

Yeah, and I love the point too, like you’re hitting on the pain points of the speed element, but there’s also the quality piece with the bias. So there’s some core like value points you’re starting to hit on. But I was digging through your LinkedIn and your CEO, James, I’ll mispronounce his last name, but he had this like interesting quote that was out there, a survey by User Interviews that said UX researchers were the most tentative

 

Nisha (06:06.146)

Prisha.

 

Matt (06:16.44)

of all roles to use AI with 44% saying they’ve tried to use AI, but avoid it in their research. But by comparison, CX professionals at 5%, data scientists 9%, product managers 21%. What do you think is the reason behind that? Why are user researchers in particular less likely to adopt this technology that could potentially make things easier for them?

 

Nisha (06:44.542)

I mean, honestly, I think it all boils down to like fear of the unknown. Um, if you look at like 9%, right? Data scientists are 9%. Like we, most data scientists understand exactly what’s going on at the bottom level. Right? Like it’s, we’re, it’s mathematical. There’s no like magic. It, there’s a lot of, um, inference, um, based on similar words and.

 

Matt (06:58.237)

Yeah.

 

Matt (07:02.931)

Mm-hmm.

 

Mm-hmm.

 

Nisha (07:09.75)

Um, and words transformed into numeric representations, and that’s where like it all stems from. So I think like the number one thing is fear of the unknown. And, and then it just goes into like, I don’t want this to take away my job. And then like, so then I feel like people would get on the defensive of saying like, AI cannot do my job the way I do, like, better than me, it’s not going to replace me, so I don’t trust it. Um, I think instead, like where we could go with this is.

 

Matt (07:16.195)

Mm-hmm.

 

Matt (07:31.589)

Yeah.

 

Nisha (07:37.758)

AI is augmenting my job. Like I can actually focus on the important pieces versus like the tedious nature of things that I could actually like bring to the forefront using a tool that does what I would be doing over a week or two weeks in a matter of minutes, right? And then I can spend the time taking those insights and making more inferences and pulling more information out of it.

 

Matt (07:41.428)

Hmm.

 

Nisha (08:06.002)

I can also speed my research cycles up. So I think that like that fear, like we’ve heard it, we do our own user research with CoNote, and I think it’s just like, what’s going on under there? Like it’s a black box. And I think that like the way that I would talk to people who had that fear is that it’s not a black box. It’s just, it’s like something that I can help explain and walk through. I think that would just get boring though because it’s super technical. But.

 

Matt (08:19.217)

Mm-hmm.

 

Nisha (08:37.34)

It is all related to similarities and semantic understanding, and AI is also not here to take your job. I will end with that.

 

Matt (08:47.46)

Yeah, and that’s an interesting theme we’ve had across several episodes we’ve done lately, is there is that fear of the unknown, that fear that it’s going to take my job, that it’s going to replace me. But this idea of a co-pilot, it’s enhancing my skills, it’s making me better, is a theme we’ve continued to hear. I was chatting the other day with, and I’m trying to find the episode, it was Brennan at

 

I think maybe that’s episode 15. We’ll see where it is when it launches, but it’s Hypercontext. They’re solving this tool for one-on-ones and performance reviews, and they had the same kind of idea with HR, which was like, you know, AI cannot solve this for me. There’s no way. But what was interesting is like with the latest, you know, just craze over the past year, you know, ChatGPT and all that kind of stuff

 

They were able to play around with it and get a sense of how things can work. And it kind of opened up their minds a bit. I don’t know. Has some of that happened with user researchers as of late where we’ve had this crazy hype cycle with generative AI, where people see some of the power with it, because I think with user research, it’s, it’s so qualitative. I think that’s one of the big hiccups there as well. It’s like, you know, this is, this is qualitative stuff. It’s not ones and zeros.

 

Nisha (10:04.448)

Yeah.

 

Matt (10:04.764)

But with generative AI, it adds that kind of semantic piece to it, to your point.

 

Nisha (10:09.982)

Yeah, yeah, no, I think that there is a growing acceptance and, you know, like, people want to use this when they start seeing the way that generative AI can augment their research versus takeover. I think people are more accepting. Like, I think we just actually spoke to someone recently that’s getting on the platform at a large corporation, and they were a little skeptical at first and then

 

Matt (10:37.053)

Mm-hmm.

 

Nisha (10:37.43)

we introduced CoNote as it gets you 85% of the way, right? It’s not doing all the research. It’s just getting you to a point, jumping off point where then you can take those findings and build your own insights. And that helped her feel better. She was like, oh, okay. So it’s not like just giving me this output. It’s more so like giving me stepping stones to get to that output. And I think when put like that, researchers seem to be more open to using the tool.

 

Matt (10:41.372)

Yeah.

 

Matt (10:54.812)

Mm-hmm.

 

Nisha (11:05.506)

are using products like this, like any kind of generative AI products. You know, there are a couple out there in the market that seem to be getting some kind of traction. I can talk to those later. But like I do think that like it’s like, and still in like the early adopter phase, right? Like people are still like wary. And we have to show people at CoNote that like the reasons why they should be using it. And I think that’s like, you know, like what we’re doing for that is building a lot of user education.

 

Matt (11:22.915)

Mm-hmm.

 

Nisha (11:35.45)

showing people how they can use the tool to augment their research and giving examples like within CoNote of how you can do that.

 

Matt (11:43.568)

Yeah, it’s an interesting kind of product. And then getting into the marketing problem where they may be problem aware, but not really solution aware, and trying to migrate them down that path. Let’s get into like kind of short-term evolution about how AI can impact user research versus like longer term. And in the short term, I’d be curious, and this may even be like functionality within CoNote or stuff on y’all’s roadmap. Like what’s the short-term view of how

 

generative AI or AI in general is helping the user research process? I mean, is it simply just churning through this long interview and it’s spitting out the insights? Like where do you see it today? And then like, what’s like the crazy, like, you know, utopian future of what it could be in the future.

 

Nisha (12:33.09)

So right now, the power of CoNote lies within, you know, like we are actually moving pretty fast. We released our initial beta live June, July 18th, and we’ve already had a couple of releases since. The big powerful generative AI piece right now, so like I just wanted to take a step back. Like we, I don’t think CoNote is 100% generative AI. We have layered models. We do use traditional machine learning,

 

Matt (12:45.959)

Mm-hmm.

 

Nisha (13:03.146)

like large language models. And I think to that extent, like there’s already like power there. And that’s why we call it an AI engine versus like just like gen AI, right? Um, and what we’re doing, like the big powerful piece right now is that you can upload hours of research, so multiple interviews, and then you can synthesize. So not only can you synthesize, you can transcribe the interviews and see the transcriptions, uh, see the diarization by speaker and then highlight key quotes.

 

Matt (13:12.596)

Mm-hmm.

 

Nisha (13:31.766)

You can then synthesize your interviews, and then under 10 minutes, you will get the key themes, and then the keywords within each theme, and those keywords directly relate to sentences within the transcripts. So let’s say like I get four themes. I can click into those. I can then see where each speaker, like if I had five interviews, I can see where each of those speakers said that, mentioned that theme. I can then click into detailed view, where I can actually hear.

 

Matt (13:38.025)

Mm-hmm.

 

Matt (13:53.768)

Hmm.

 

Nisha (14:01.302)

the speaker saying it so I can get sentiment. And I can also bookmark this and build a presentation that I can send out to a stakeholder that may be interested in some of the key quotes that were said over eight hours of interviews, which is usually, as you know, usually would take so much more time. So yeah, I’ll stop there. That is our current big bang of…

 

Matt (14:17.948)

Mm-hmm.

 

Nisha (14:27.762)

our AI engine and we definitely have some other plans ahead, but just wanted to stop for any questions.

 

Matt (14:34.832)

Yeah, it’s an interesting point too. You mentioned generative AI is the hyped-up word right now, but machine learning, and you as a data scientist know some of this stuff’s been around for a very long time. This is not necessarily a new thing, right? And there’s so much power just in machine learning and a lot of the things there as well. And I’m curious too.

 

Nisha (14:52.555)

Yes.

 

Matt (15:04.616)

It seems like every day there’s another gen AI product coming out there. Like, how do you see differentiating when, you know, I feel like a lot of this, the tools with AI have been a bit democratized to where people have access to these large language models. You kind of mentioned that’s not the only core point to your, your tool, but how do you build a moat when it’s so much easier now to integrate some of this technology into a tool? Like, how do y’all think about that?

 

Nisha (15:33.11)

I think we have to really think about the user, right? Like, sure, everyone can access these APIs and build and integrate them into their product. Are they actually thinking about the UI and the UX? Like that is a key piece of Conote. Like you wanna have, and as you know, like also being in product, like you want to have a really intuitive like journey when you get to an app, right?

 

Matt (15:36.968)

Mm-hmm.

 

Matt (15:47.06)

Hmm.

 

Matt (16:00.082)

Yeah.

 

Nisha (16:00.942)

Uh, so you could integrate an API and build like all the tech and be amazing and stacked and everything. And if you’re building a SaaS product and don’t have like an intuitive front end, people are just going to stop there. Like they’re not going to know how to get from point A to point B. And at CoNote, my, uh, so I have James Frisha as CEO and co-founder. And then my third co-founder is Cameron Ridenour, and he’s a, he’s chief design officer, and so his background is UX. Right.

 

Matt (16:20.285)

Mm-hmm.

 

Matt (16:26.237)

Mm-hmm.

 

Nisha (16:28.914)

And so not only do we, like, live and breathe these problems, we get in touch with people that live and breathe these problems. We have people that also work with CoNote that do. And I think that our moat is, like, that we’re not just simply consuming APIs, right? We have other pieces of infrastructure around them on the AI side that actually enhance and empower us to be a little ahead of

 

Matt (16:50.494)

Mm-hmm.

 

Nisha (16:56.298)

not a little, a lot ahead of some of these GenAI companies that are just simply consuming and using prompts for some of these APIs. And then secondly, just the fact that we have such a deep understanding of the user and are focusing on that when building our product, right? The experience, the interaction with the app. And if you’re listening to this and are curious, please go check out conote.ai. It is live and free and…

 

Matt (17:02.738)

Yeah.

 

Nisha (17:24.798)

Matt, I’m not sure if you’ve checked it out, but when you’re on the application, compared to many other competitors that we’ve checked out and tried out, there is a very simplistic flow to get to the pain point that we’re solving for, which is really being able to speed up your research process.

 

Matt (17:47.332)

Yeah, I think part of the benefit there is you’re very focused in on a particular type of user, which is user researchers, right? I think so many folks, and we see this with a lot of clients too, they’re trying to serve too many different people. And, and then you get into, back to user experience, how can you build, you know, not simple, but just intuitive user experience when you’re trying to serve different groups? Do you have, even within user researchers,

 

a persona within that you’re targeting, or is it user research that’s kind of the core? Is there a type of user research you’re even more granularly focused in on?

 

Nisha (18:23.438)

Um, maybe not a type of user research per se, but definitely a type of user researcher that is, um, you know, uh, interested in synthesizing multiple interviews and has a research cycle they can’t keep up with. Um, or potentially, you know, like where I’m trying, where we’re trying to drive people is the fact that research is more important than people give it. Like it takes so much time. The research cycles are longer than development cycles, right? Like

 

Matt (18:29.671)

Yeah.

 

Matt (18:46.454)

Mm-hmm. Yeah.

 

Nisha (18:52.146)

I, like, if I’m thinking about dev, I think of CI/CD and DevOps, and in CI/CD and just in general agile principles, a sprint is two weeks. There is no way that like researchers think they can finish a research cycle in two weeks. However, with CoNote, you could do a week of interviews and then synthesize and be done and ready with new, with new findings for the next sprint. And I think that is a missing piece in the entire like end-to-end process.

 

I have tirelessly worked with development teams and like led engineering and data science teams. And the missing gap is that they don’t get the user, they don’t understand thoroughly like the user research part, right? Like they, it’s like a game of telephone, like 10 people have spoken to each other before the engineering team hears what they need to build. And they can get so in like, uh, you know, like just in deep in the rabbit hole of like, Hey, this is how we’re going to do it. Technically.

 

Matt (19:31.964)

Yeah.

 

Nisha (19:50.93)

and not be thinking of like the actual user problem. And that’s where I really want to like, that’s where CoNote comes in, right? CoNote gives you the ability to add continuous research to CI/CD. So like in my mind, it should be CR/CI/CD. Like that should be instilled in the development process.

 

Matt (19:53.736)

Mm-hmm.

 

Matt (20:11.12)

No, I love that. And you’re speaking our language. When we talk about Built Right, we talk about building the right digital solution the right way. And for building the right one, a key element is user research. And I love the concept you’re talking about, where it has to be continuous. And this is what we preach as well. Like so much of the time, it’s like, all right, let’s go do our research. All right, discovery’s done. Let’s go build the thing. But it has to be.

 

Nisha (20:26.4)

Yeah.

 

Matt (20:36.712)

built into the process. So I love that idea of it’s, you know, you think of CI, CD, same type of thing. You need that feedback loop built in as you evolve the product. It seems like y’all are kind of, you know, dogfooding it a bit by using the product yourself. I’m curious, like how much of the roadmap are y’all like defining as you use the product versus feedback from customers?

 

Nisha (20:47.842)

Right.

 

Nisha (21:01.294)

And we try to definitely take more of the customers’ feedback, just because they’re using it as, like, as the customers. But like I do have to say, when I, like, I listened to a podcast recently, and I was like, started to listen to it and I was like, let me just put this through CoNote and see what happens. And it just was able to distill like the key points so fast. And going back to like the roadmap ahead you were asking about, in our next release actually, October 13,

 

Matt (21:06.845)

Mm-hmm.

 

Matt (21:18.182)

Yeah.

 

Nisha (21:29.75)

There’s a really cool feature coming out that’s called Action Items. So now not only do you get themes that have been synthesized during the process, but you actually get the items to action on, right? So like, this is what your users have talked about, these are the actions to take. That came from us using it and from feedback. Like, I think, like, I wouldn’t say 50-50. I’d want to say more like 70 users, 30 us, if we had to put a ratio to it.

 

Matt (21:38.716)

Mm-hmm.

 

Nisha (21:56.138)

But I think we end up all seeing, like, the great thing is, we, I think we all end up like coming up with very similar pain points. And one of the main pain points we heard is like, this is great, but it doesn’t give me items still, like, where I need to go next. So, so I ran the initial, like, the first round of CoNote interviews we did before, user interviews we did before we had started building our product, right? We had just like a prototype.

 

Matt (22:11.037)

Hmm

 

Nisha (22:23.418)

I ran those interviews on dev through the Action Items feature to see what the action items were. It actually gave me action items that were the features that we ended up building, which is crazy, right? It told us users want audio recordings, users want the ability to integrate with Zoom and Google Meet. I think that’s…

 

Matt (22:39.812)

Wow. Yeah.

 

Nisha (22:52.502)

That’s like, I kind of got off on a tangent, but that’s what happens when I get excited. I think that’s something that we’ve heard from users and that we’ve also experienced, that we’re really excited about. And then, yeah, like I think it’s cool that we get to use it as we do our process as well, because it definitely makes us realize, like, what is, like, you know, like sometimes you can just be drinking from the fire hose. Like you think of really cool ideas, but we use it and we’re like, this is annoying, we need to change it. Like we spot the little things too.

 

Matt (23:10.248)

Mm-hmm.

 

Matt (23:20.564)

Mm-hmm. Yeah.

 

Nisha (23:22.923)

So yeah, it’s a good mix.

 

Matt (23:25.252)

Yeah, that’s interesting. And, you know, getting into the, um, let’s get into like the future state, like, you know, way in the future, you know, where do you see the practice of user research going if things continue to evolve where AI is continuing to evolve? Like, is there a future where it’s not even a real person doing the interview? And like, do I, at some point in the future have a, an agent or a bot that’s, you know, collabing with somebody else? And.

 

Performing this research. Like, do you ever see a future where it looks like that? Like, where does your mind go when it starts to wander of where things could be way in the future?

 

Nisha (24:05.262)

I mean, I don’t know about like, yeah, like great point. And I think people like wonder about that. But like, for me, I think there’s like a degree to personal interaction. Like if you’re a bot interviewing me right now, I feel like, sure, maybe like in some years, there will be AI that’s able to replicate each of us very well. But I do think like that human to human interaction is important in being able to, you know, like what? 94% of…

 

cues or, like, communication is nonverbal, right? Like I think there’s a lot to process that’s outside of just like a conversation, that I’m sure AI will be able to replicate, but I don’t know if I’m like, yes, like we want to make everything computer-like, you know, that, like, in the age of AI, and just take away the human element. I think more so like the way I see CoNote evolving in the future is being able to scale across, right? Like not like becoming so.

 

Matt (24:59.879)

Hmm

 

Nisha (25:01.262)

I’m so focused on like automating the entire user research process, but being able to scale to all types of research. So like as like to be able to reach product market fit and to really understand our target audience, we want to focus on user researchers right now. But to be able to like scale, I think where we go is just redefining all types of research, right? Like how do we help in the political space? How do we help in academia? How do we go into like?

 

Matt (25:17.064)

Mm-hmm.

 

Matt (25:23.717)

Hmm.

 

Nisha (25:29.662)

specific types of research. And I think that’s where I see CoNote moving. That’s where we’re going in the future. Not like, I don’t see us adding a component where we’re gonna build in AI bots that interview people. And so once again, that’s why I feel we’re not taking away anything. It’s more just like, let’s augment the cycles so that people can be more productive and be up to speed with development teams. Just like, I just read

 

Matt (25:43.642)

Yeah.

 

Nisha (25:58.81)

someone posted today about Copilot, like the code AI, right? And just telling engineers that Copilot is something people should lean into. They can automate so much of what they’re currently doing, like some of the, like, tedious, like, granular code writing that, like, you don’t necessarily need to spend as much time on, and can focus on the bigger picture. I see that exact parallel

 

to CoNote with user research.

 

Matt (26:29.636)

Yeah, that’s a great connection point there. We’re using Copilot a lot at HatchWorks, and it kind of just gets the mundane out of the way, so you can think about the bigger problems. But I want to pause here, like, for all the listeners: when you’re thinking about product strategy and product in general, the way Nisha and team are doing it is a perfect example. They’re solving a specific use case for a particular user, in user researchers, but you can also see where her mind’s going in terms of, like, tangential

 

other areas where they could move into in the future, but you kind of build this, uh, you know, this base with user researchers first. And that allows you the opportunity, uh, to expand further out, but you got to do that first. So that’s a great way to think about it. Don’t try to boil the ocean, solve something specific first, but is there an area you mentioned a couple, is there one that you think is like, Oh, this is definitely where we’re going next.

 

uh, from you mentioned like the political side, like these different areas, is there one that excites you outside of just traditional kind of, you know, product technology solution, user research.

 

Nisha (27:39.555)

Yeah, I don’t think I can say that like there’s one, like I think there’s multiple, right? Like, people have already been using CoNote for marketing use cases. So I think that’s probably like the next place to really go, right? Like, hey, we want to distill all of these, uh, these interviews or these, uh, podcasts and find the key quotes. Um, and this is going to help us be able to like.

 

Matt (27:51.004)

Mm-hmm.

 

Nisha (28:00.746)

make our marketing campaigns faster, just being able to pull these quotes out and having people saying them. So I think that’s a place that we can really like either be used right now or expand to immediately. I think political campaigns could be really cool because as we’re coming up also into a big year, like I think just hearing a lot of like if people, you know, like if there’s campaign interviews, being able to distill those and, and then once again, like play clips depending on whoever we’re working with.

 

And then I think that academia is close to my heart and also a really great space to be able to use this. Like, let’s say you’re a master’s student working full time, which I was, and you have, like, multiple lectures, right? Like, that you have gone to and that are recorded. Imagine being able to use CoNote to upload these lectures and then to just be able to find the most important themes and use this to study.

 

Matt (28:41.5)

Yeah.

 

Nisha (28:55.87)

Like I think this basically is, with some tweaks, of course, right? Like we’re, like, once again, like you said, like we have focused on user research for a reason, um, and I see this being expanded into like a line of products potentially, or, you know, CoNote Academia, CoNote Marketing, that kind of thing, but, um, just imagine being able to like take your notes and be able to like have like an easy way to, to, like, search across, like, hours and hours of lectures.

 

Matt (29:10.563)

Mm-hmm.

 

Nisha (29:24.706)

That would have made my life so much easier, honestly, when I was doing my master’s. So I just think, like, yeah, those are like some key areas that I’m excited to focus on. I don’t know if like one will come before the other. I think we still have to like really nail this initial product-market fit and group down. But I think that there’s like, the exciting thing about CoNote is I feel like there’s so much room to grow. And there’s like so, there’s so many things that I want to act on, which makes me feel excited about it.

 

Matt (29:54.764)

Nice. Well, I think that’s a good stopping point, Nisha. Thanks for being on the Built Right podcast, and where can folks find you, find CoNote? What’s the best place to go?

 

Nisha (30:04.546)

You can email me at nisha@conote.ai. And you can also just check out CoNote. It’s live, you get five free credits. Go test it out, email me, let me know what you think. So our website is conote.ai. And then from the website, you can log into the app. So it’ll take you straight there and it’s pretty easy. So yeah, we’d love to hear from you all.

 

Matt (30:31.888)

Awesome, great having you on Nisha, thank you.

 

Nisha (30:34.55)

Yeah, thanks Matt.