Mayukh Bhaowal 45 min

From Salesforce Einstein to Conversational Mastery


In this engaging session, join Mayukh Bhaowal, Co-Founder and CEO of CueIn.ai and former product leader at Salesforce Einstein, as he explores the transformative power of AI in customer conversations. Discover how CueIn.ai leverages artificial intelligence to revolutionize chatbot effectiveness and conversation workflows, enhancing customer experience through innovative analytics and conversation mining. Drawing from his rich experience at Salesforce Einstein, Mayukh will reveal key strategies for optimizing conversational AI, offering insights into the future of AI-driven customer dialogues and their impact on business success.



0:00

(upbeat music)

0:02

- And welcome to our next session with Mayukh.

0:09

Mayukh from CueIn.ai.

0:11

Mayukh, how are you doing today?

0:13

- I'm doing great.

0:14

How are you doing?

0:15

I'm just looking forward to the holidays.

0:17

- Oh, I am too.

0:18

I am too.

0:20

It's cool just to do this during the,

0:21

during like right before everything's happening

0:23

after Black Friday.

0:25

But I will say I am excited to talk to you.

0:28

I think you had quite a journey.

0:30

I'm very curious to ask you questions on that.

0:33

And then get into the details of your,

0:35

of your new company that you founded with your co-founder.

0:37

But first, I want to call out for the audience.

0:41

I know that you were part of Salesforce for,

0:44

I think it's over four years

0:47

working on Salesforce Einstein.

0:48

Do I have that right?

0:49

- Absolutely right.

0:50

Yeah, almost four and a half, five years.

0:52

Yeah, even I forget how long it was.

0:54

- So can you, can you talk us through just that journey

0:58

and that experience for you?

1:00

And then obviously how that integrated

1:02

and led you to founding CueIn.ai.

1:06

- Yeah, absolutely.

1:07

So I was on the product side on Salesforce Einstein.

1:10

So this is when the, by the way,

1:12

Einstein is like Salesforce's AI umbrella.

1:15

It's a branding and--

1:16

- Thank you for clarifying that.

1:17

- Yeah, under then there are many AI products.

1:19

So I was a part of the platform.

1:21

And so what it really means is, you know,

1:24

Salesforce has all of these clouds, like,

1:26

yeah, the service cloud,

1:27

they have sales cloud, marketing and commerce.

1:30

To me it feels like each of them are very big companies

1:32

within this huge giant company called Salesforce.

1:35

And each of them, and we're talking about

1:38

like five or six years back, right?

1:40

You know, there was still AI then.

1:43

There was no large language models in general,

1:45

but AI was very prevalent.

1:47

And there was all of these opportunities

1:48

to build AI into enterprise applications.

1:51

And so, and if you think about many of the underlying

1:55

technologies being built are kind of common across,

1:58

whether it's a sales use case or a marketing use case

2:00

or a service use case.

2:01

And the, and the theme of this platform was,

2:04

can you build those underlying building blocks

2:06

which will power sales, marketing and so on and so forth.

2:09

So that's the platform that I was working on.

2:12

What is most interesting and I know I talk about this a lot,

2:16

I almost felt like an investor because you kind of see

2:19

this wide variety of use cases across all of this different,

2:23

you know, verticals.

2:25

And the more important thing, you also see which of them

2:27

are experimental versus which one actually gets mass adoption.

2:31

So you kind of start sharpening that instinct.

2:33

Okay, these are just experimental.

2:35

It's a cool demo.

2:36

It's great.

2:37

It's fascinating.

2:38

But you know, there won't be a lot of enterprise traction

2:41

into it.

2:42

So that's, that's.

2:43

- And you're talking internally, like, well,

2:44

looking at the data and the clients that are utilizing the

2:48

- Exactly.

2:49

Because, you know, we see that Salesforce has a lot

2:51

of customers and some of them are, you know,

2:54

I would say, not innovators, right?

2:56

And, you know, they adopt something when it's quite mature.

2:59

So you get to see which of those are mature,

3:01

which of them are making impact versus which ones are not.

3:04

And so that was kind of fascinating as a part of that

3:06

platform.

3:07

And we were mostly working on something called automated

3:10

machine learning.

3:11

So like, how do you automate the machine learning for,

3:14

you know, all of these different clouds,

3:16

all of these different use cases?

3:17

For instance, in sales, lead scoring:

3:19

you have all of these leads,

3:21

how do you know which leads are better

3:23

and are more likely to convert into an opportunity?
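The lead-scoring idea described here can be sketched in a few lines. This is a toy illustration, not Salesforce Einstein's actual model; the feature names and weights below are invented, where a real AutoML pipeline would learn them from historical conversion data.

```python
def score_lead(features, weights):
    """Return a conversion score as the weighted sum of lead features."""
    return sum(weights.get(name, 0.0) * value for name, value in features.items())

def rank_leads(leads, weights):
    """Sort leads from most to least likely to convert."""
    return sorted(leads, key=lambda lead: score_lead(lead["features"], weights),
                  reverse=True)

# Invented example weights; in practice these would be learned, not hand-set.
WEIGHTS = {"email_opens": 0.4, "site_visits": 0.3, "company_size": 0.2}

leads = [
    {"name": "Acme", "features": {"email_opens": 1, "site_visits": 0, "company_size": 2}},
    {"name": "Globex", "features": {"email_opens": 5, "site_visits": 4, "company_size": 3}},
]
ranked = rank_leads(leads, WEIGHTS)  # Globex scores higher and comes first
```

The same weighted-score shape applies to case-priority classification and routing, just with different features and labels.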

3:26

If you think about service, like,

3:28

there are common use cases, obviously bots is there,

3:30

but there are also use cases around

3:31

you have tickets and cases.

3:32

Like, can you classify those things based on priority?

3:35

Who should it be routed to, you know,

3:37

you know, in marketing, you can think about recommendations

3:40

related use cases.

3:42

And my second half, I kind of moved more

3:45

into conversational AI.

3:47

So I was heading our Einstein Bots.

3:49

I was a part of the service cloud.

3:50

So I was working on conversational AI.

3:52

And also had influence on our agent assist

3:56

as well as some of the voice programs.

3:58

So that's kind of what my journey at Salesforce was.

4:02

Also got lucky enough to give one of the keynotes,

4:05

Einstein keynotes in Salesforce's

4:07

annual Dreamforce conference.

4:09

Very different experience, a lot of stress,

4:12

as you can imagine, behind what goes on,

4:14

but it was really fascinating to be able to talk

4:16

about some of the things our team was building

4:18

to the community.

4:20

- I'm sure.

4:21

And I actually came across some of those

4:23

and I tell the audience, go ahead and search that as well too.

4:25

Really interesting to see.

4:27

So what was the, for you, what was the then moment

4:30

of like, okay, I wanna, I wanna, like, what did you see?

4:33

Maybe you didn't, but did you see something

4:35

at when you're at Salesforce, specifically at Einstein,

4:38

that led you to the creation of your own company?

4:41

- Yeah, absolutely.

4:42

I think there were definitely seeds planted.

4:46

By the way, you know, as someone who at one time

4:49

wants to be an entrepreneur, you know,

4:51

you're always looking for things for a while, you know,

4:53

and the timing is not always perfect.

4:58

But working on the conversational AI side,

5:01

and this was both pre-pandemic and during the pandemic

5:04

and post-pandemic.

5:05

So I saw that whole journey.

5:07

You know, conversation workflows were quite important.

5:12

Like what it means is like, whether virtual agents

5:15

or bots, or even live chat with human agents

5:17

or even voice calls.

5:19

And more and more of, you know, the Salesforce's customers

5:23

were actually investing in conversational workflows.

5:27

Definitely a lot in bots, especially during the pandemic

5:29

because of the obvious reason.

5:31

And one of the things that I noticed is

5:33

there are a lot of tools out there

5:35

which help you to build these workflows.

5:38

But when it comes to evaluating these workflows

5:42

or, you know, doing analytics on top of it,

5:44

and I'm not talking about high-level analytics,

5:46

really understanding deep what's happening

5:48

in the conversations where things are breaking,

5:50

where users are getting frustrated,

5:52

there were not a lot of tools there.

5:54

And the problem itself was quite difficult.

5:57

And so that was one of the key insights that I had.

6:01

Like, you know, the observability, the mining,

6:03

how do you evaluate was quite important.

6:05

I think in a more succinct way

6:07

or a more emotional way to put it is like all of the CX teams

6:10

they are building all of these workflows,

6:11

they're building, designing their bots

6:13

or they're, you know, designing their knowledge bases.

6:16

But many times they didn't have confidence

6:18

in what they're building.

6:19

So how do we help them gain confidence that,

6:21

hey, what you build works?

6:23

It helps customers, it makes them happy

6:26

and solves their problem.

6:27

So that may be in one way what we're trying to solve.

6:30

How do we make the CX teams feel more confident

6:33

in what they're building?

6:34

- Yeah.

6:35

And at least for myself and the audience,

6:38

can you give maybe an in-depth definition

6:42

of what you talk about when you say workflows?

6:46

Because you talked about the chat and I get that,

6:48

but is there an overarching understanding

6:50

when you're talking about workflows for a company?

6:52

- Yeah. So any of these like,

6:54

if you think about a chat bot, like let's say,

6:57

if you talk about a commerce scenario,

6:59

someone says that, hey, they want to return something.

7:01

Like what are the steps involved to actually do the return

7:05

and how is it shaped in your conversations?

7:09

Maybe they're asking for the email address,

7:11

maybe they're asking for the order number

7:13

and maybe they ask which country they're from

7:15

and where they bought it from.

7:17

And based upon that, the steps to return might be different.

7:20

Some cases you might have to go into a store

7:23

and some cases they might be able to do a return online.
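A return workflow like the one just described is essentially a small decision tree over the facts collected in the conversation. A minimal sketch, with invented rules:

```python
def return_workflow(order_number, country, bought_in_store):
    """Decide the next step for a return based on what the customer told us."""
    if not order_number:
        return "ask for the order number"
    if bought_in_store:
        return "return at the store where it was purchased"
    if country == "US":
        return "online return with a prepaid label"
    return "online return with international shipping instructions"

step = return_workflow("A123", "US", bought_in_store=False)
```

In a real bot each branch would be another conversational step rather than a string, but the branching logic is the same.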

7:26

Similarly, there can be for various different use cases,

7:30

the workflows can be also on the live chat side.

7:32

Like when it goes to human agent, how are they solving it?

7:35

Let's say we take a use case of a financial services.

7:39

Maybe they have an additional charge on the credit card

7:41

that they don't recognize.

7:42

So how does the live human agent solve that problem?

7:46

Again, this whole thing is happening not through a UI,

7:48

it's happening back and forth

7:50

through a conversation or a phone call.

7:51

- And I want to just ask you one-off question on this

7:57

to get your opinion.

7:59

As you're talking about this,

8:00

how I try to identify this a little bit is on the phone.

8:05

And specifically, like we've had now for years and years

8:08

and years, IVRs, Interactive Voice Response systems.

8:12

- Yes.

8:13

- And there's workflows on that.

8:16

And I recall a while ago in my career building

8:20

those as well too.

8:22

Is this brand new territory or can you analyze,

8:27

were people analyzing IVRs and now there's just no way

8:30

to understand this or is this two completely different things

8:33

that we're talking about?

8:34

- It's related but there has been a shift.

8:37

Like, of course there are ways to analyze IVRs

8:40

but more and more conversations are going,

8:44

there are voice bots which are not IVR driven.

8:46

Also the actual human conversations like,

8:49

sure you go through the steps maybe

8:51

and then you're actually going to a human.

8:53

And how is that conversation panning out?

8:56

How are the humans interacting to address their queries?

8:59

Are they interacting well?

9:01

Are they even understanding?

9:02

In some cases it might be that the industry is shifting

9:07

or the products are shifting

9:08

and the humans don't have the right knowledge

9:10

at their fingertips to be able to resolve it.

9:13

And how do you figure out which of those areas

9:15

where they're not able to resolve?

9:17

Also underlying there is so much of wealth of product knowledge

9:20

that you can leverage.

9:22

For instance, like maybe going back to the example of return,

9:25

like you understand people are coming to return stuff

9:28

but what every company wants to know is why.

9:30

So what is the root cause underneath it?

9:32

It's like, and some of them are kind of surprising.

9:34

It's not always that they are coming for return.

9:37

You would think that, oh, sizing issue for e-commerce

9:40

or something like that.

9:40

But sometimes people are coming,

9:43

asking questions about return

9:44

because the return policy is not clear.

9:47

Or maybe the return slip was not included in it.

9:51

Or maybe they went to the store,

9:52

they didn't get the return.

9:53

So it's not about just returning the product

9:56

but also the process and where some of those bottlenecks are.

9:59

And you can imagine there are hundreds and thousands

10:01

of conversations happening across,

10:04

sometimes channels, different modalities, maybe voice, chat,

10:07

and bot.

10:08

How do you aggregate all of those and see

10:11

what are the top root causes around the returns

10:13

where my company can focus on to improve the CX?

10:16

And that is what I'm going to argue for.
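Aggregating root causes across channels, as described above, can be as simple as counting labels once each conversation has been tagged. A sketch with invented labels:

```python
from collections import Counter

def top_root_causes(conversations, n=3):
    """Count root-cause labels across channels and return the n most common."""
    return Counter(c["root_cause"] for c in conversations).most_common(n)

# Invented data: each conversation, whatever its channel, carries one label.
convos = [
    {"channel": "chat", "root_cause": "return policy unclear"},
    {"channel": "voice", "root_cause": "return policy unclear"},
    {"channel": "bot", "root_cause": "missing return slip"},
]
ranking = top_root_causes(convos)
```

The hard part, of course, is producing the `root_cause` label in the first place, often from the agent's side of the conversation rather than the customer's.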

10:18

- I think that's fascinating because I think right now

10:20

if you look at just the market,

10:22

building and creating, let's just use the example of bots,

10:27

seems to be table stakes at this point.

10:29

And it's really easy to do that now.

10:31

But the talk of what you're mentioning of iterating

10:36

and how to iterate isn't talked about much at all.

10:39

It's just about getting it up there,

10:40

but not like, all right, well, how do you improve on that?

10:43

So walk us through now, CueIn.ai.

10:48

So I know you gave kind of the premise

10:51

and the theory behind it.

10:52

Like, talk to us about the platform itself,

10:54

what have you built and who is it

10:57

that you sell to?

10:59

I know we talked about mid-market enterprise companies.

11:02

- Yeah, so we are building a co-pilot

11:04

for customer experience teams,

11:06

mainly focusing on unifying the support data

11:10

across modalities, channels, or even vendors.

11:12

Some of these companies are using different vendors

11:15

and bringing them all together

11:17

and providing, they have lots of questions

11:20

about that conversational data that's going on.

11:23

And we analyze and provide answers to those questions.

11:26

And among the main problems,

11:28

the three key problems that we're helping to solve

11:30

is to help them scale their CX so that it does better.

11:35

And some of the problems that they have faced

11:37

is like multiple channels and conversations happening.

11:41

How do they connect them together?

11:43

A second is larger volume of conversations.

11:45

Like, just imagine even listening to one call recording

11:48

or reading through a chat.

11:49

It's very hard, it's time consuming.

11:52

I've done it, and after you do a few,

11:55

you feel like, I don't want to do this job.

11:57

Like, who's going to go read through all of these things?

11:59

It's just, it's very, very hard.

12:02

And so when you have hundreds and thousands

12:05

and sometimes millions of those conversations,

12:07

how do you analyze that?

12:08

Most companies do a sample from time to time.

12:11

It's not very comprehensive.

12:13

And another thing that's coming up is,

12:15

how can you do that in real time?

12:16

For instance, like, we just talked about Black Friday.

12:19

And we have scenarios where certain coupons

12:24

or discounts were given out.

12:25

But when the customers were trying to add

12:28

that coupon code in the cart, it somehow didn't work.

12:31

And, you know, there were two things,

12:34

this is a real example.

12:36

And there was a spike in customer support requests

12:40

around they were trying to get price adjustments

12:42

because the coupon code didn't work.

12:44

Or many of them are trying to cancel their order.

12:46

And you want to know this in real time

12:49

because every hour you wait, you're actually losing revenue.

12:53

It's not even customer support, you're losing revenue.

12:55

So the sooner you come to know that there is something broken,

12:57

the sooner you'll be able to fix it.

12:59

So it's more like real time trends and anomalies

13:01

is another thing that we are looking into

13:04

and offering to our customers.
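The real-time anomaly idea can be sketched as a simple baseline-versus-current check on topic volume. The window and threshold below are illustrative, not CueIn.ai's actual method:

```python
from statistics import mean, stdev

def is_spiking(history, current, min_sigma=3.0):
    """True when the current count sits well above the recent baseline."""
    if len(history) < 2:
        return False
    mu, sigma = mean(history), stdev(history)
    return current > mu + min_sigma * max(sigma, 1e-9)

# Hourly counts of "coupon code not working" conversations (invented data).
baseline = [4, 5, 3, 6, 5, 4]
alert = is_spiking(baseline, 40)  # a Black Friday style spike
```

Run per topic per hour, a check like this is what turns "the coupon code is broken" from a post-mortem into a same-hour alert.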

13:06

- So then I assume that a lot of questions

13:10

that companies need to ask,

13:13

like are there, like are there templates?

13:16

Are there common questions, right?

13:18

Like you get to a point of, all right,

13:20

this company, when you're looking at all the data,

13:23

do you need to ask this, this, and this?

13:24

Or is every company unique from your perspective?

13:28

And how do you help with that, right?

13:30

Because it's like now you got access all the data,

13:33

you're saying to me that now you can ask questions

13:35

about anything, about products, about discounts.

13:39

Like where do you start?

13:41

Is I guess my question to you?

13:44

- Yeah, I think it's a very nuanced question.

13:46

It's a great question.

13:48

In certain industries, there are overlaps

13:52

around the kinds of questions that are asked.

13:55

And in certain industries, there is less and less overlaps,

13:58

more snowflake in nature, because their business is very unique.

14:02

And so, but what is interesting is the underlying technology

14:08

to understand language and what users are asking.

14:11

So we don't have any notion of, our AI model doesn't know

14:15

that this is an e-commerce return use case

14:17

versus this is a financial services,

14:20

and a credit card charge that they want refunded.

14:22

It just looks into those conversations

14:25

and, in an unsupervised way, identifies those intents and inquiries

14:28

that pop up.

14:29

And it's really a factor of the volume,

14:32

if more and more volume of conversation

14:35

are asking that specific query, that intent would bubble up.

14:38

So that taxonomy is almost iteratively built

14:42

and doesn't need to come from the customer themselves.

14:46

Having said that, if they already have a taxonomy built,

14:49

they can influence that.

14:50

So there are ways to seed that model.
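The unsupervised intent discovery described here can be illustrated with a toy version: group utterances by similarity without any predefined taxonomy. Real systems use learned embeddings; plain word overlap stands in for them here:

```python
from collections import Counter
from math import sqrt

def similarity(a, b):
    """Cosine similarity between bag-of-words vectors of two utterances."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = sqrt(sum(c * c for c in va.values())) * sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0

def cluster(utterances, threshold=0.5):
    """Greedily assign each utterance to the first cluster it resembles."""
    clusters = []
    for utterance in utterances:
        for group in clusters:
            if similarity(utterance, group[0]) >= threshold:
                group.append(utterance)
                break
        else:
            clusters.append([utterance])
    return clusters

groups = cluster([
    "i want to return my order",
    "how do i return an order",
    "my coupon code is not working",
])
```

The two return questions land in one cluster and the coupon complaint in another; at scale, cluster size is the "volume" that makes an intent bubble up.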

14:52

So that's why we are kind of excited

14:54

because the technology works.

14:55

So we have customers in e-commerce, in health and benefits

15:00

and travel and hospitality and financial services.

15:03

And we're not doing extra additional work.

15:06

The technology automatically figures it out

15:08

and is able to answer the question.

15:10

- Yeah, it does.

15:13

I guess like just to go one step deeper into that

15:15

for the audience, is it that from your opinion

15:18

working with your clients so far is that they have been

15:22

trying to figure out these answers about,

15:26

what are the discounts being used

15:27

or how effective is our chatbot.

15:29

And now you have provided the technology

15:31

for them to do that.

15:32

Or is the kind of technology there first

15:35

and you're trying to educate the clients of like,

15:37

okay, here are some questions now

15:40

that you can start asking your business.

15:43

- I love that question.

15:44

It's actually both.

15:44

So they wanted to know like,

15:48

so this is going back to Salesforce.

15:50

They implemented the chatbot.

15:51

They knew the first five use cases,

15:53

almost every company knows what are the top five

15:55

or six use cases.

15:57

But they wanna know how it's doing.

15:58

Is it breaking apart?

16:00

Like even in those use cases,

16:02

there are deviations.

16:02

Like maybe the bot gave an answer,

16:04

but it didn't satisfy the customer.

16:07

And they wanna know what is the percentage of cases

16:09

where the customer was not satisfied.

16:11

And generally what happens

16:13

is they get escalated to a human agent.

16:14

So how did the human agent satisfy it?

16:16

And can we learn from that diff?

16:18

Like what the bot said and what the agents said

16:20

and inform or improve the design of the bot.

16:22

They always wanted to know it.

16:24

I don't think the technology was there

16:26

to do that very well,

16:27

especially deep into the conversations.

16:30

So they did more high level.

16:33

Look at these are the areas where the answers

16:35

are not given, they know that,

16:36

but they don't know how to improve.

16:38

What's the next step?

16:39

What's the action to take?

16:41

The second part is because of a large language model

16:44

and the advancement in technology to understand

16:48

some of the use cases,

16:49

which was very hard and has to be done manually,

16:52

can now be automated.

16:54

So there are new opportunities there.

16:56

Like this root cause analysis that I talked about,

16:58

like, I mean, it's quite interesting

17:01

because sometimes the root cause

17:02

of why the customer reached out,

17:04

you cannot analyze that by listening to the customer.

17:08

You have to listen to the solution

17:09

because the agent has more insight.

17:11

They're looking into their account,

17:12

they're finding some issues and said,

17:13

oh, this is the problem.

17:15

And then we see that problem is repeating.

17:17

And from the agent side,

17:19

you have to also look at the agent side

17:21

and understand that and it can be anywhere in the conversation,

17:23

not just at the start or in the middle.

17:25

It can be multiple user utterances

17:27

to be able to understand the entire conversation

17:29

and surface that that's the technology

17:31

which didn't even exist before.

17:33

And related to that, you know,

17:35

another common thing is QA of agent interaction.

17:37

That's very common stuff in the industry.

17:39

And almost every company has some kind of a rubric,

17:42

you know, for how they actually do that.

17:44

And it's very manual.

17:45

They have rubrics like if the agent showed empathy,

17:48

did they personalize the conversation,

17:51

did they provide next steps in the conversation,

17:53

did they read what the user said before

17:56

and not force the user to restate it.

17:59

I mean, you just have to look into the conversation

18:01

and score in each of these factors.

18:03

And there was no AI to do that before, you know,

18:06

before like generative AI and large language models.

18:09

So some of those use cases

18:10

are now becoming more prevalent and can be done more easily.
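The rubric-based QA scoring described above amounts to turning the company's manual checklist into a prompt an LLM can apply to every conversation. A sketch of the prompt-building step, with an invented rubric (the LLM call itself is omitted):

```python
# The rubric items mirror those mentioned in the conversation; the exact
# wording is invented.
RUBRIC = [
    "Did the agent show empathy?",
    "Did the agent personalize the conversation?",
    "Did the agent provide next steps?",
    "Did the agent avoid forcing the user to restate information?",
]

def build_qa_prompt(transcript, rubric=RUBRIC):
    """Assemble a scoring prompt: the transcript plus one yes/no question
    per rubric item, to be answered as JSON by an LLM (call not shown)."""
    questions = "\n".join(f"{i + 1}. {q}" for i, q in enumerate(rubric))
    return (
        "Score the support conversation below against each rubric item.\n"
        "Answer with a JSON object mapping item number to yes or no.\n\n"
        f"Conversation:\n{transcript}\n\nRubric:\n{questions}"
    )

prompt = build_qa_prompt("Customer: my order is late.\nAgent: I'm sorry to hear that...")
```

The point is that a rubric a QA team applied to a sample by hand can now, in principle, be applied to every conversation automatically.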

18:13

So I would say the answer is both: things that were there.

18:16

They wanted to know, but it was not possible.

18:18

And there are some of these things,

18:19

newer areas now which can be done

18:21

which could not be done before.

18:23

- So I want to go one level deeper

18:29

and get a specific from you, which is if let's say,

18:33

and we don't have to use an actual client of yours,

18:35

but like direct to consumer businesses,

18:38

after Black Friday, let's say they have a large collection

18:43

of data, phone, email, chat,

18:46

and they're using CueIn.ai, you know,

18:50

but what would be one or two examples of questions

18:53

that they should be asking right now

18:55

if they had all the access to the data and access to CueIn.ai.

18:58

- Yeah, I think some of the questions

19:00

they should be asking is like,

19:01

I mean, if you're talking about the CX

19:03

and the customer support

19:04

and they have some kind of a bot in place,

19:07

how is the bot able to self-serve, and at what percentage?

19:11

And in the cases where it's not able to self-serve,

19:14

what are those topics, are there newer topics

19:16

where it's not able to self-serve.

19:18

And when they're not doing it,

19:21

how are the human agents solving that

19:23

so that it can inform the design of the bot

19:26

or optimize the bot design.

19:27

So in the future when the next Black Friday comes

19:30

or throughout the year, it's able to self-serve more of those.

19:34

So that's kind of another thing.
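The self-serve questions above reduce to a couple of simple metrics once each conversation is tagged with a topic and an escalation flag. A sketch with invented fields:

```python
from collections import Counter

def self_serve_rate(conversations):
    """Fraction of bot conversations resolved without escalating to a human."""
    resolved = sum(1 for c in conversations if not c["escalated"])
    return resolved / len(conversations)

def top_escalated_topics(conversations, n=3):
    """Most common topics among escalated conversations."""
    return Counter(
        c["topic"] for c in conversations if c["escalated"]
    ).most_common(n)

# Invented data: topic plus whether the bot had to hand off.
convos = [
    {"topic": "order status", "escalated": False},
    {"topic": "order status", "escalated": False},
    {"topic": "billing dispute", "escalated": True},
    {"topic": "billing dispute", "escalated": True},
]
rate = self_serve_rate(convos)  # 0.5
```

The escalated-topic ranking is what feeds back into bot design: the topics humans keep solving are the next ones to teach the bot.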

19:37

The other thing which I talked about,

19:38

especially in Black Friday,

19:39

understanding this trending stuff and anomalies that's happening

19:43

and trying to be able to identify it sooner,

19:45

I think would be quite critical.

19:47

And that's what they're trying to do.

19:49

Like let's say I gave the example of the coupon code

19:51

which is very common during holidays

19:53

and it didn't work.

19:54

And as a result of which the customers were complaining

19:57

about it or, even worse, canceling their order,

19:59

and they wanna know it sooner.

20:01

And that's the kind of question,

20:02

what are the frictions in their buying process

20:05

and can they know it immediately in real time

20:08

so they can address that?

20:09

- I love that and thanks for that.

20:10

I think that's really insightful.

20:12

What, I have to give you another broad question

20:16

but it's because we're not specific

20:18

but it's like in your opinion,

20:21

what should a company optimize for

20:24

in your example with self-serve, right?

20:26

Like should a company be optimizing for like,

20:30

okay, yeah, if we weren't able to self-serve 40%

20:34

of those questions, how do we get that to 20%?

20:36

Is it okay to be at 50%, getting the agent involved?

20:40

I guess at the end of the day, like is it, what's the answer?

20:45

And I know it's not every company is the same here

20:48

but I always like to ask the question,

20:50

like what are we optimizing for?

20:53

And how would you respond to that?

20:54

- Yeah, I think, I think a good way to think about it.

20:58

Like we should put ourselves in the,

21:00

I mean, we're all consumers at the end of the day.

21:01

Like it's not that we don't want to interact with the bot

21:05

or that we always love to interact with a human

21:07

and it goes vice versa.

21:09

It's like we want to get, we have a problem,

21:12

customers have a problem and we want to get it resolved.

21:14

If the bot is able to resolve it faster, quicker

21:18

without me waiting for 10 minutes on a call,

21:20

I would love that.

21:21

But there are some complex topics, right?

21:23

Which I know a virtual agent cannot solve it

21:26

or at least not today.

21:28

For that, don't make me go through that process.

21:31

I would like to talk to a human

21:32

who can actually solve some of those complex stuff.

21:35

So I don't think the goal is like,

21:37

I don't think it will ever be like,

21:39

you know, 100 percent self-service.

21:41

There are cases where the humans are involved.

21:44

So, but the easier ones and some of the more nuanced ones

21:48

can be solved by the bot

21:49

and the more complex and difficult ones

21:51

can go to the human.

21:52

But again, when it goes to the human,

21:54

do the human agents have

21:56

the right tools at their fingertips?

21:58

Like for example, one of the companies

21:59

that we are working with, they're in the benefits management,

22:02

during the open enrollment time,

22:04

they have like huge, you know,

22:06

guides around benefits and people asking about the deductibles

22:09

and all of those complex scenarios.

22:11

And it would be great for the human to, you know,

22:13

ask this question to that knowledge base

22:15

and get the answer quickly to be able to support.

22:17

So that's also part of the optimization

22:19

for the human agents.

22:21

One other thing I wanted to say,

22:22

which is I see there is, it's catching up more,

22:25

is this age-old concept

22:29

of customer satisfaction,

22:30

but it's more like a real-time inferred

22:33

customer satisfaction because the problem

22:35

with the traditional method is that it's survey based,

22:39

it happens later on, it has low take rates, it's biased,

22:42

and there's so many other things.

22:44

But the most annoying part is like,

22:46

if you read the customer conversation,

22:47

you would have a very good guess how satisfied they were.

22:51

But yet we do this annoying thing of asking later

22:54

on like, how did it go?

22:55

So the idea is, can we just go to the conversation,

22:58

take the, see the steps that the bot or the human agent did,

23:01

and kind of the conversation they had to infer

23:04

what the customer satisfaction is,

23:05

which is a leading indicator.

23:07

And the reason I brought it up,

23:09

that has to be balanced with the self-serve.

23:11

Like there are ways you can self-serve like 100 percent,

23:14

you can just tell the customer to go away,

23:17

but your CSAT would be really down.

23:18

I'm talking about an extreme case,

23:19

but you have to balance the self-serve

23:22

as well as the customer satisfaction that's happening.

23:25

Because if you drop the customer satisfaction,

23:27

I don't think that's the greatest.

23:28

So that's another very, very interesting thing.

23:31

And it's actually even more important

23:32

with large language models to build those guardrails,

23:34

because the bot can go off the rails.

23:37

And you want a real-time interception

23:39

to understand the customer satisfaction

23:41

is actually declining.

23:43

And maybe it's a good time to stop the bot

23:45

and go to a human agent.
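The inferred-satisfaction guardrail just described can be illustrated with a toy version: score frustration from the customer's own turns and hand off once it crosses a threshold. The cue list and threshold are invented stand-ins for a learned model:

```python
# Invented frustration cues; a real system would infer this with a model.
FRUSTRATION_CUES = ("not working", "useless", "cancel", "speak to a human")

def frustration_score(turns):
    """Count customer turns that contain a frustration cue."""
    return sum(
        any(cue in turn.lower() for cue in FRUSTRATION_CUES) for turn in turns
    )

def should_escalate(turns, threshold=2):
    """Hand off to a human agent once enough frustrated turns accumulate."""
    return frustration_score(turns) >= threshold

convo = ["my coupon is not working", "this bot is useless", "hello?"]
handoff = should_escalate(convo)  # True: time to stop the bot
```

Because the signal comes from the conversation itself, it is a leading indicator, unlike a survey sent after the fact.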

23:47

- Yeah, I love your discussion

23:50

about the real-time CSAT scores.

23:52

And where I'm drawing the line,

23:54

and maybe the audience that is in B2B SaaS,

23:57

you can understand as well too,

23:58

which is that if you think of a tool like Gong,

24:01

for instance, and Gong's now at a point of,

24:04

it can tell you a certain prospect or deal,

24:07

the likelihood of a closing based on all the data and inputs.

24:11

And it's really that same concept of like,

24:13

okay, in real time, this customer,

24:15

what is the likelihood that they're like

24:16

actually 100% happy, or actually,

24:18

they'll never buy again, in real time.

24:20

I think that's fascinating.

24:21

Maybe one other kind of area and topic I'd like to ask you

24:25

about is, again, this data collection,

24:29

because all of this is based on every single customer

24:33

touch point that you have,

24:34

whether that be on the phone,

24:35

whether that be through the chat,

24:37

or even just website visits,

24:39

whether they're clicking out

24:39

and really understanding everything there is

24:41

about the customer.

24:42

I guess from your point of view,

24:44

has there been a shift in that?

24:47

Like in the last few, maybe just year, I'd say,

24:49

or even with the founding of CueIn.ai,

24:52

where are companies thinking of data

24:55

differently now than before?

24:57

And how important is that now moving forward?

25:00

- I would think, I mean, I don't know if you would agree.

25:03

I mean, more and more companies and execs and decision makers

25:07

are realizing data is more and more important.

25:10

And being able to collect and store

25:12

and have clean data is quite crucial.

25:15

Also, another thing that we are seeing is there's more,

25:21

depending upon the business,

25:22

because we have some clients in different

25:24

kind of verticals and domains,

25:26

concern about the security and the privacy of the data,

25:29

because there are consumer rights involved.

25:31

So there are things happening there

25:34

where you need to make sure that when you're analyzing,

25:37

the PII and PHI is de-identified there.

25:41

And one subtle difference,

25:43

I'm not talking about redaction,

25:44

reduction is removing it.

25:46

Like it's because that hinders understanding

25:48

the conversation is de-identified.

25:50

We don't need to know what the email is.

25:51

We just need to know there was an email there.

25:53

That's good enough.
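[Editor's note: a minimal sketch of the de-identification idea described above — each piece of PII is replaced with a type token, so analysis still knows *what* was shared without knowing the value. The patterns and message are illustrative only, not CueIn.ai's implementation.]

```python
import re

# Replace PII with type tokens instead of removing it outright (redaction),
# so downstream analysis keeps the shape of the conversation.
# These regexes are simplified examples, not production-grade detectors.
PATTERNS = {
    "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "[PHONE]": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "[CARD]": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def deidentify(text: str) -> str:
    for token, pattern in PATTERNS.items():
        text = pattern.sub(token, text)
    return text

msg = "Sure, my email is jane@example.com and my card is 4111 1111 1111 1111"
print(deidentify(msg))
# The analyzer now knows an email and a card number were shared --
# which, as noted above, is good enough for understanding the conversation.
```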

25:54

And that part is quite critical,

25:57

I think, especially in some of the financial services

25:59

or healthcare domains,

26:01

because I don't know if you have ever come across a bot

26:04

where it says, don't share your credit card information.

26:06

You know, they just give this disclaimer

26:08

and just hope that no one shares it,

26:10

but someone might share it.

26:11

And so being able to protect that,

26:14

I think is quite important as well, I would say.

26:18

The other thing is being able to,

26:23

you know, this is also a very common problem.

26:25

I think there are CDPs built around this

26:27

when you have these conversations

26:28

across different channels, especially across vendors.

26:30

Sometimes, actually quite often,

26:32

it's very hard to connect them together.

26:34

So I think that realization is becoming more and more important.

26:38

Like, is there some way, some ID,

26:40

which can connect them,

26:42

the more you can do that,

26:43

the more you'll be able to stitch them together

26:46

and be able to analyze.

26:47

So I think that's also quite important.
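[Editor's note: a hypothetical sketch of the stitching idea above — events exported from different channel vendors, joined on whatever shared identifier is available. The field names and data are invented for illustration.]

```python
from collections import defaultdict

# Events from two hypothetical channel exports, sharing a customer key.
phone_calls = [{"customer_key": "a1b2", "channel": "phone", "ts": 1}]
chats = [{"customer_key": "a1b2", "channel": "chat", "ts": 2},
         {"customer_key": "c3d4", "channel": "chat", "ts": 3}]

def stitch(*sources):
    journeys = defaultdict(list)
    for source in sources:
        for event in source:
            journeys[event["customer_key"]].append(event)
    # Sort each customer's events into one cross-channel timeline.
    for events in journeys.values():
        events.sort(key=lambda e: e["ts"])
    return dict(journeys)

journeys = stitch(phone_calls, chats)
print([e["channel"] for e in journeys["a1b2"]])  # phone call, then chat
```

The hard part in practice is that vendors rarely share an identifier out of the box — which is exactly the realization described above.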

26:49

- Yeah, I would agree with that.

26:51

I guess the last question I have for you,

26:54

it's more about looking,

26:56

it's hard to say looking forward

26:57

because I feel like we have so much

27:00

in the present right now,

27:01

but maybe just looking into next year,

27:03

which is less than a month away.

27:06

You know, what would you be excited about

27:09

specifically at CueIn.ai

27:12

that you can share with us?

27:15

- Yeah, I think some of these things I talked about

27:18

are definitely not done.

27:19

We are constantly iterating on it.

27:22

This anomaly and trend piece I talked about,

27:24

it's a very complex problem.

27:27

It's almost like, you know, Google Trends, right?

27:29

You go to Google Trends, you search for a thing,

27:30

you see how it looks.

27:31

Imagine the same thing for your customer support data,

27:34

right there at your fingertips

27:35

and you can see everything.

27:37

So we have made some progress there,

27:39

but there's so much more to do there.
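[Editor's note: a minimal sketch of the anomaly-and-trend idea above — flag days where ticket volume deviates sharply from its recent baseline. The data and threshold are invented; real systems would need seasonality handling and per-topic baselines.]

```python
import statistics

# Daily ticket counts for one support topic; the last day spikes.
daily_counts = [42, 40, 45, 43, 41, 44, 118]

def is_anomaly(counts, z_threshold=3.0):
    """Flag the latest day if its z-score against the baseline exceeds the threshold."""
    baseline, latest = counts[:-1], counts[-1]
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    z = (latest - mean) / stdev
    return z > z_threshold, round(z, 1)

flagged, z = is_anomaly(daily_counts)
print(flagged, z)  # the spike is flagged well above the threshold
```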

27:41

Similar thing when I was talking about this,

27:43

understanding and analyzing conversations,

27:45

I'll give another analogy, maps analogy, right?

27:48

So think about how many of these conversations

27:51

are designed. You know, let's say the user wants

27:53

to get a solution to a problem.

27:55

So the analogy is they want to go from point A to B

27:58

and the way these things are designed,

28:00

the map looks like there's one straight path from A to B

28:03

and that's what the map is all about.

28:04

But maps are not like that.

28:06

You know, users are taking so many different paths

28:09

to get from point A to B

28:10

and there are bottlenecks or higher traffic

28:12

on some of those paths.

28:13

So applying that analogy to the world of conversation and CX:

28:17

what are those different paths taken,

28:18

where are those bottlenecks? Being able to surface

28:20

that visualization, I think, is also key

28:23

to being able to solve some of this problem.
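[Editor's note: a sketch of the maps analogy above — each conversation is a sequence of steps the user took toward resolution; counting distinct paths surfaces which routes are common and where traffic piles up. The step names and data are invented.]

```python
from collections import Counter

# Each conversation as the path of steps the user actually took.
conversations = [
    ["greeting", "order_status", "resolved"],
    ["greeting", "order_status", "agent_handoff", "resolved"],
    ["greeting", "order_status", "agent_handoff", "resolved"],
    ["greeting", "refund", "agent_handoff", "abandoned"],
]

# Count full paths (routes on the map) and per-step traffic (bottlenecks).
path_counts = Counter(tuple(c) for c in conversations)
step_traffic = Counter(step for c in conversations for step in c)

print(path_counts.most_common(1))   # the most-traveled route from A to B
print(step_traffic.most_common(2))  # the highest-traffic steps
```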

28:26

The last thing I would say we are working on it

28:28

is kind of a natural language BI.

28:31

It's like a ChatGPT for all your customer support data.

28:35

So, but it's very different.

28:37

It's not like, you know, you put in all of that conversation data,

28:39

you ask the question and you magically get the answer.

28:42

It doesn't happen.

28:44

You have to do some of those analysis before

28:47

and then you do, you know, natural language to SQL

28:50

and you start getting those responses,

28:51

but it becomes very powerful because you can just go into this

28:54

and say, tell me what are the top contact patterns

28:57

and you pick one: tell me what are the top root causes around it,

29:00

and then you can say, okay,

29:01

how are the agents giving responses to that?

29:03

You get this workflow right there

29:06

without going through dashboards and dashboards

29:08

and creating custom reports and, you know,

29:10

so that natural language BI can be another very interesting thing

29:13

that we are working through.

29:15

It's not as straightforward as it sounds or, you know,

29:18

oh, you just put ChatGPT for enterprise on it and it works.

29:20

It doesn't work that way.

29:22

But that's another area that we are looking into.

29:24

Probably there's some opportunities in the future.
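[Editor's note: a toy sketch of the natural-language-BI idea above — the heavy analysis (tagging each conversation with a contact pattern) happens *before* query time, and the chat layer only maps a question onto SQL over those derived tables. The table schema, the question-to-SQL mapping, and the data are all invented for illustration.]

```python
import sqlite3

# Derived table produced by prior analysis: each conversation already
# tagged with its contact pattern.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE conversations (id INTEGER, contact_pattern TEXT)")
db.executemany("INSERT INTO conversations VALUES (?, ?)",
               [(1, "refund"), (2, "refund"), (3, "shipping"), (4, "refund")])

# Stand-in for an LLM's natural-language-to-SQL step: in a real system
# a model would generate the SQL; here it is canned to keep the sketch runnable.
CANNED = {
    "top contact patterns":
        "SELECT contact_pattern, COUNT(*) AS n FROM conversations "
        "GROUP BY contact_pattern ORDER BY n DESC",
}

def ask(question: str):
    return db.execute(CANNED[question]).fetchall()

print(ask("top contact patterns"))  # refund ranks first
```

As the speaker notes, the pre-analysis is the hard part; the SQL-over-derived-tables step only works once the conversations are already structured.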

29:26

- Yeah, super interesting.

29:28

And I think for those listening that are in operations,

29:33

there's a lot to look forward to and a lot of,

29:36

I'll be honest, a lot of responsibility coming their way,

29:39

like analyzing data in ways they never could before.

29:42

So that's really interesting and I feel like you're providing

29:45

so much more, again, opportunity to shift and change

29:49

in real time to customer support organizations.

29:53

So that's awesome.

29:54

Mayukh,

29:55

I'm sorry.

29:56

Again, thank you so much for your time.

29:59

I wish you the best in 2024,

30:02

and beyond for CueIn.ai. For those listening,

30:04

Please check out what they have to offer.

30:07

And that's it.

30:09

Again, appreciate your time.

30:11

- Thank you so much, Brian. And for the audience,

30:13

like, we have an integration with Kustomer.

30:15

It's live on the Kustomer website.

30:16

So do check us out.

30:18

- And what else do you have integrations with?

30:21

I just wanna make sure we're sharing that with everyone here too.

30:24

- So we integrate with a bunch of vendors from Salesforce,

30:28

to Genesys, to Amazon.

30:30

There's a whole list.

30:32

So because that's one of the key things.

30:34

Like we made the integration quite easy

30:37

and once you do that,

30:38

and many customers use different vendors

30:39

we're kind of agnostic.

30:42

So yeah, so it's also on our website.

30:43

If you go to CueIn.ai,

30:44

you will see all of those integrations as well.

30:46

- Amazing.

30:47

Thank you for pointing that out.

30:49

And thanks all.

30:50

(upbeat music)
