
Content provided by Himakara Pieris. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Himakara Pieris or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://sv.player.fm/legal.

AI in Automotive Retail with Ankit Raheja from CDK Global

32:47
 

I’m excited to share this conversation with Ankit Raheja. Ankit is a lead product manager focused on AI, data, and APIs at CDK Global. During this conversation, Ankit discussed the AI product development lifecycle, metrics for AI products, and how product managers could start their AI journey with small steps.

Links

Ankit on LinkedIn

CDK AI Survey: What Automotive Leaders Think About Artificial Intelligence

DeepLearning.AI: Start or Advance Your Career in AI

Transcript

[00:00:00] Himakara Pieris: Welcome to the Smart Product Show. My guest today is Ankit Raheja. To start things off, could you tell us a bit about your current role and how you're using AI as a product manager?

[00:00:11] Ankit Raheja: Absolutely. Currently I am a lead product manager at CDK Global.

[00:00:20] Ankit Raheja: CDK Global is the largest car dealership software company in the United States; we power more than 15,000 dealership locations. That's why it is one of the biggest forces you haven't heard about: you do not interact with it directly, but I'll tell you, 15,000-plus dealerships are using it.

[00:00:53] Ankit Raheja: And we are embedded across the whole user journey, starting from [00:01:00] the front office. The front office is when you go to a dealership to purchase a car and get all the different warranty and insurance options. Second is fixed operations: the car services that you get done when you go to a dealership.

[00:01:21] Ankit Raheja: Then there is the back office. You can imagine dealerships need to take care of inventory for parts and vehicles, and many other things. And last but not least, these dealerships need massive infrastructure to run. So we are embedded across all four parts of the user journey. On the next question you mentioned, about where exactly we have used AI: I have been in the AI space since 2013.

[00:01:55] Ankit Raheja: It was a combination of data and AI. In the past, we [00:02:00] have used AI at companies such as Cisco, Visa, and the State Compensation Insurance Fund. Number one, we worked on customer support use cases; then market segmentation use cases at Visa; and finally healthcare fraud detection use cases at the State Compensation Insurance Fund. Currently, using AI at CDK, we are leveraging it across multiple ecosystems.

[00:02:33] Ankit Raheja: Number one, we are trying to match potential customers with potential cars, so it's like a propensity-to-buy model. Second is predictive service. Basically, what we're trying to do is: when you go to a car dealership, sometimes you do not know what additional services you need.

[00:02:56] Ankit Raheja: And, you know, you are a busy professional, [00:03:00] you have so many other things to worry about. So we want these car dealership employees to be able to recommend additional services that you may not even have thought about. That's the second use case. Last but not least, we are also exploring benchmarking use cases: for example, you have one dealership group and you don't know how you are doing.

[00:03:24] Ankit Raheja: Like, are you doing well? Do you need to catch up on a few things? That's where the benchmarking comes in. So these are the current use cases. And as you know, chatbots are becoming more and more prevalent now, but right now I just want to focus on the current use cases and the use cases that I've worked on previously.
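As an illustration of the propensity-to-buy idea above, here is a minimal sketch of what such a model might look like as a plain binary classifier. The feature names and data below are hypothetical stand-ins, not CDK's actual schema or model.

```python
# Minimal propensity-to-buy sketch (hypothetical features, not CDK's actual model).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1_000
# Hypothetical features: visits to the dealership site, days since last service,
# age of current vehicle (years), and number of past purchases.
X = np.column_stack([
    rng.poisson(3, n),          # site_visits
    rng.integers(0, 720, n),    # days_since_service
    rng.uniform(0, 15, n),      # vehicle_age_years
    rng.integers(0, 4, n),      # past_purchases
])
# Synthetic label: customers with older cars and more visits are more likely to buy.
p = 1 / (1 + np.exp(-(0.4 * X[:, 0] + 0.2 * X[:, 2] - 3)))
y = rng.binomial(1, p)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Score every customer; the dealership can then rank outreach by propensity.
scores = model.predict_proba(X_test)[:, 1]
print("AUC:", round(roc_auc_score(y_test, scores), 3))
```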

[00:03:47] Himakara Pieris: Great. And before this, you had an interesting use case with chatbots at Cisco as well.

[00:03:54] Ankit Raheja: Absolutely. Yeah, I can definitely talk a little bit about the [00:04:00] chatbot at Cisco. Let me give you some context around the issue. Basically, Cisco has a lot of switching products, router products, basically all B2B products.

[00:04:17] Ankit Raheja: And some of them, as you can imagine, will become defective, and you want to return those products. However, Cisco identified that a lot of these products do not need to be returned; some of them are avoidable returns. So technically we were trying to solve an avoidable-returns problem. The existing way to solve that was that customers would reach out to the Technical Assistance Center engineers,

[00:04:55] Ankit Raheja: who are, in [00:05:00] more layman's terms, technical customer service engineers. They troubleshoot these problems with the customer and then decide whether the product should be returned or not. We realized AI could be a really big help to these Technical Assistance Center engineers because you can basically get a lot of scale.

[00:05:25] Ankit Raheja: Number two, AI is like an intern that is trying to learn new things. As it learns more and more, it will become better and a lot more helpful for them. And sometimes these technical assistance engineers are not available; that's where the chatbot can come in.

[00:05:43] Ankit Raheja: So, multiple reasons why we thought AI made sense, and we really had great impact by leveraging AI for these use cases.

[00:05:56] Himakara Pieris: So Cisco and CDK, these are very large companies [00:06:00] with a ton of use cases. How did you decide on the use cases, when to use AI and when not to use AI, and what kind of framework do you use for that?

[00:06:12] Ankit Raheja: Absolutely. I have a spicy take on this. The first rule of AI is not to use AI in the first place when you're in the discovery stage. You should be able to understand how a human can do this work first. I'll give you two examples. Autonomous driving: right now, instead of an autonomous car, you're the one driving, so you're the one looking around, hey, here's the signal, here's a pedestrian, here's the road. You should be able to do that first.

[00:06:51] Ankit Raheja: The other example is the chatbot: we had these technical assistance engineers who were already doing the work. So the [00:07:00] framework is pretty simple and universal: AI is only one of the ways that may solve the customer's problem, and it still needs to drive business value. We have seen so many times, as with the ChatGPT hype, more and more products coming out, but time will tell how many of them will really be retained.

[00:07:25] Ankit Raheja: Right now there's big hype, but eventually retention is the key. To think about this, I have a very simple framework. It's used a lot, but there's a bit of nuance to it. Number one is user value: are you providing real value to customers? Why should these customers hire your solution? Are you helping them with their jobs to be done?

[00:07:52] Ankit Raheja: So that's the first thing, the first constraint you'll look at. Number two, which is very important, is your business goals; you may not even get [00:08:00] funding if you don't have a good answer for it. It can't be just because your CEO said, hey, I see ChatGPT is doing really well. You need to really start from the vision,

[00:08:12] Ankit Raheja: go to the strategy, then to the goals, and come up with your KPIs. And what are your KPIs? Do you want to acquire more users? Number two, do you want to retain more users? Number three, do you need to monetize these users more through upsell or cross-sell? Or, last but not least, do you need to drive more word of mouth, net promoter score?

[00:08:33] Ankit Raheja: So that's the second thing, the business goals. The last constraint we need to think about is the technical component of it: how comfortable are you using a predictive solution versus a deterministic solution? Imagine, [00:09:00] for example, you can make a machine go through and read a medical chart for cancer.

[00:09:10] Ankit Raheja: Would you put all the onus on the machine to make the call? I would not say that. You still need to have a human in the loop. However, in some cases, like Amazon's recommendation engine, there are so many different permutations and combinations that can come up in the long tail. That's where AI makes sense.

[00:09:33] Ankit Raheja: So it all depends on a case-by-case basis. If you want, I can definitely go into more detail about the AI use cases.

[00:09:41] Himakara Pieris: Generally speaking, start with a focus on customer value, then map it to your business goals and strategy, and have clear KPIs. And make sure that your proposed solution can deliver on those KPIs.

[00:09:59] Ankit Raheja: Absolutely.

[00:09:59] Himakara Pieris: So how [00:10:00] would you compare that with, let's say, a more deterministic solution? I'm sure at all these companies you have a very large and long backlog of things that you could do.

[00:10:10] Himakara Pieris: Does this mean that AI solutions are possibly going to sink to the bottom of the backlog, because they are relatively more difficult to quantify, or because the time to value might not be as quick as with a more deterministic solution?

[00:10:28] Ankit Raheja: Sure. It all depends on the use cases. We have made this possible in this world of building and launching something fast and getting feedback.

[00:10:42] Ankit Raheja: You can always build a minimum viable product, or what I call a minimum viable algorithm. You can always build a simple model. For example, even if you are thinking about LLM use cases, [00:11:00] there are so many other machine learning libraries already available that you can use to prove out the value quickly.

[00:11:10] Ankit Raheja: And then you can get buy-in from your leadership. It's all about influencing without authority and showing how it will drive value. After you get a little bit of buy-in, you start putting more and more people on the problem. So it's a little bit different from the normal product development life cycle.

[00:11:33] Ankit Raheja: There's another life cycle, the AI product development life cycle, which makes a lot more sense here than the one for normal products.
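One way to read the "minimum viable algorithm" idea above is: before investing in LLMs or heavy infrastructure, compare a trivial baseline against a simple off-the-shelf model and check whether the lift is worth further investment. A hedged sketch, using a scikit-learn built-in dataset purely as stand-in data:

```python
# Minimum-viable-algorithm sketch: prove value with an off-the-shelf library first.
from sklearn.datasets import load_breast_cancer  # stand-in dataset for illustration
from sklearn.dummy import DummyClassifier
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Baseline: always predict the majority class.
baseline = DummyClassifier(strategy="most_frequent")
simple_model = GradientBoostingClassifier(random_state=0)

base_score = cross_val_score(baseline, X, y, cv=5, scoring="f1").mean()
model_score = cross_val_score(simple_model, X, y, cv=5, scoring="f1").mean()

# If the lift over the baseline is large enough to move a business KPI,
# that is the evidence used to ask leadership for further investment.
print(f"baseline F1: {base_score:.3f}, simple model F1: {model_score:.3f}")
```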

[00:11:43] Himakara Pieris: Very cool. Let's talk a bit about the AI product development life cycle. And on the back of that, I'd love to pick your brain about designing and building AI MVPs as well.

[00:11:57] Ankit Raheja: Absolutely. Yeah. I [00:12:00] think a good example would be my experience at Cisco, so let's share that case study; I think it will make a lot more sense here. For the AI product life cycle, the number one thing, and it is universal, is that we should really start from first principles.

[00:12:21] Ankit Raheja: Problem identification: whether this problem makes sense for us to solve, whether there is value we are able to get from it, and third, whether it aligns with the strategy. For example, Amazon is not going to start sending rockets into space; that will be other product groups.

[00:12:47] Ankit Raheja: So it all depends on how it aligns. Number one is problem identification. Number two, since it's a data product, is your data sourcing and data preparation strategy. [00:13:00] You can start small by taking some sample data and seeing how it generates value. But as you know, 80 percent of the time goes into cleaning the data,

[00:13:15] Ankit Raheja: and 20 percent goes into building the model. So data sourcing and data preparation is the number two step. The third is model building: you build the model, and you launch a small product in the market or do a beta test, it depends on you. And then you do tracking on top of it.

[00:13:36] Ankit Raheja: As you start tracking it, you get more and more ideas. You iterate over it and either change the problem, change the data set, or change the model. It all depends. At Cisco, we basically had a triple-track agile process. I had a track working on discovery of [00:14:00] different machine learning models, because when we started at Cisco our accuracy was not that great; it was a little lower than the human benchmark. So you can imagine there was some hesitation for these technical assistance support engineers to adopt this product wholeheartedly.

[00:14:25] Ankit Raheja: So one team was working on discovery of the newest and latest models and how we could improve the accuracy. The second track was all about data acquisition. You live and die with this data; that's why you see big tech spending so much time building their data moat. So the next track, which can work in parallel, is data acquisition.

[00:14:53] Ankit Raheja: They need to start sourcing the data and cleaning the data so it can be fed to a model. Last but not least, [00:15:00] delivery of these models. It's not about building a model in a Jupyter notebook; it's about deploying the model in production so that you can get feedback.

[00:15:13] Ankit Raheja: So there were three different tracks. In a normal product development lifecycle, you would put all of them in one track. And you can imagine that AI engineers, AI infrastructure, and data cleansing are not a cheap affair. That's why 90 percent of AI products fail: we are not thinking about setting up our processes to really drive quick value and iterate quickly.
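To make the 80/20 split and the data acquisition and cleaning work described above concrete, a data-preparation step for dealership service records might look something like the sketch below. The column names and cleaning rules are hypothetical, for illustration only.

```python
# Illustrative data-cleaning sketch (hypothetical columns, not CDK's actual schema).
import pandas as pd

def prepare_service_history(raw: pd.DataFrame) -> pd.DataFrame:
    df = raw.copy()
    # Normalize identifiers and drop exact duplicate rows.
    df["vin"] = df["vin"].str.strip().str.upper()
    df = df.drop_duplicates()
    # Parse dates and discard rows whose dates cannot be parsed.
    df["service_date"] = pd.to_datetime(df["service_date"], errors="coerce")
    df = df.dropna(subset=["vin", "service_date"])
    # Impute a missing numeric field with a conservative default.
    df["invoice_total"] = df["invoice_total"].fillna(0.0).clip(lower=0.0)
    # Collapse free-text service descriptions into a small category set.
    df["service_type"] = (
        df["service_description"]
        .str.lower()
        .str.extract(r"(oil|brake|tire|battery)", expand=False)
        .fillna("other")
    )
    return df

# Usage (hypothetical file): cleaned = prepare_service_history(pd.read_csv("service_history.csv"))
```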

[00:15:46] Himakara Pieris: That sounds very interesting. It sounds like you had them grouped under different competencies from a team structure and organization standpoint, because when I think about model discovery, I think of ML engineers; for data [00:16:00] acquisition and cleanup, data scientists; and for delivery of the model, MLOps and CI/CD folks.

[00:16:06] Himakara Pieris: So how did those three groups collaborate in that kind of environment? Since these are three parallel tracks, I'm guessing the deliverables or the sprints are not necessarily aligned at all times, because they might be making progress at different speeds.

[00:16:25] Ankit Raheja: Perfect. Yeah. It all depends from company to company; context, as you know, is the most important thing in industry. You cannot just take Facebook's best practices and apply them in a startup. You can't even take best practices from a company like Google and put them into Facebook.

[00:16:45] Ankit Raheja: They are so different. Thankfully, the way it worked really well in our favor at Cisco was that there was a ceremony called Scrum of Scrums. We had one program manager who owned [00:17:00] these three different tracks, and we would have a weekly meeting where we talked about what went well, what help we needed, any blockers, et cetera.

[00:17:10] Ankit Raheja: So there was a sync-up at a regular cadence, and Scrum of Scrums made sense for that. That was more of a Cisco process; if you're a startup, sometimes the MLOps is done by the same person as the data analysis and the data science. It's the same person doing everything.

[00:17:31] Ankit Raheja: So it all depends.

[00:17:41] Himakara Pieris: I want to talk a bit about bringing these solutions out, from a go-to-market and distribution standpoint. You are working with 15,000 car dealerships now, right? What does that process look like? Do you do incremental releases going out to these folks? Are they part of product [00:18:00] discovery?

[00:18:01] Himakara Pieris: Could you talk a bit about that as well?

[00:18:04] Ankit Raheja: You know, you've touched on an extremely important point. I'm realizing that the industry is still in the discovery and development stage and we don't give a lot of weightage to GTM, but you have seen the beauty of the GTM strategy that OpenAI had.

[00:18:29] Ankit Raheja: They launched the product really quickly, and they already had tie-ins with the Expedias of the world and with Instacart. So they had their GTM ramped up really well. For us at CDK, as a B2B SaaS giant, GTM is also taken extremely seriously. For a few of the products, what we have done, number one, is take the help of these customers, giving them early access [00:19:00]

[00:19:00] Ankit Raheja: to do some kind of co-development with them. We did it for one of our product offerings, called Data Your Way. It's a data product, not an AI/ML product, but that's what we did. We got their feedback and launched the product within a few months of working with them. After the co-development phase came the beta stage.

[00:19:25] Ankit Raheja: There we expanded our sample set to around 15 dealer groups. There's a difference between dealerships and dealer groups: one dealer group can have multiple dealerships under it. So we worked with those 15 dealer groups for our beta stage. And finally, we launched our product after it had been in beta for a few months, and now the product is in GA.

[00:19:52] Ankit Raheja: So it all depends. The same thing applies to AI/ML products: you co-create with the customers, give [00:20:00] them some extra credits or a discount so that they can help you decide, because there is skin in the game for them too, and then you put proper telemetry in place.

[00:20:11] Ankit Raheja: And then you can expand and make it a lot better.

[00:20:15] Himakara Pieris: How do you facilitate communication and collaboration during that process? What kind of metrics do you look at? What does the feedback loop look like?

[00:20:25] Ankit Raheja: Absolutely. On the first one, how to facilitate this conversation: again, it all depends from company to company. Depending on the size of the company, I was spread across multiple products.

[00:20:42] Ankit Raheja: So there I leveraged my amazing customer success management team and sales team, who really had one-to-one relationships with these customers. I would also go into the meetings, but they would be the project managers. We would have our spreadsheet [00:21:00] where we tracked, hey, this is the feedback that we got from these customers.

[00:21:06] Ankit Raheja: These were the pluses, these were the deltas. Then we would have bi-weekly meetings with these select customers and tell them, hey, this is something we are working on, to keep the communication going. On metrics, what we were tracking was a funnel: we started with 30 prospective beta dealer groups, knowing that many of them would be busy with so many other things.

[00:21:40] Ankit Raheja: We knew that this number would go down; we wanted a critical mass of 15-plus, and we got it. We have seen some failures in other products where we started with only one or two dealer groups. I think that's a recipe for disaster. It's so obvious, but as you know, hindsight is always [00:22:00] 20/20: always start with a big group, and expect that your customers have busy lives.

[00:22:05] Ankit Raheja: You are just embedded in part of their solutions, it all depends.

[00:22:12] Himakara Pieris: So what kind of specific business and customer metrics do you track? And are they any different from your traditional SaaS products?

[00:22:20] Ankit Raheja: Oh yeah, definitely. That's why there's a nuance to these different metrics. First of all, the first one remains exactly the same.

[00:22:33] Ankit Raheja: These are your business and customer metrics. To give you an example, again from Cisco: number one was how many additional cases you're able to handle with a chatbot. The chatbot can come in and take on some cases for you before sending them to a human being.

[00:22:59] Ankit Raheja: So, [00:23:00] how much uplift you can get with this. Number two, sometimes you don't need to staff so many additional customer service engineers, so how much is it helping reduce personnel costs? Another business metric could be how much reduction in avoidable returns you are able to get through it.

[00:23:24] Ankit Raheja: So those are two high-level metrics. The third business metric is your net promoter score. Alexa does a great job of this: sometimes Alexa will ask you, how would you rate this response from one to five? So we thought, why don't we learn from Alexa and start leveraging that, so we had a net promoter score going in as well.

[00:23:47] Ankit Raheja: So those were just the business metrics. What changes in the AI/ML space is the next thing that shows up: your algorithmic metrics. [00:24:00] When you're doing the modeling, you need to worry about metrics like accuracy, precision, and recall, and it depends on what you care about the most.

[00:24:16] Ankit Raheja: Do you care more about accuracy, or about precision, or about recall? What I have seen is that sometimes people forget which metric is the most important for the model, and you can end up solving the wrong thing. For example, in some places, like when you're doing cancer detection, you need to be really careful about the false positive metric.

[00:24:44] Ankit Raheja: Sorry, the false negative metric. A false negative is when somebody has cancer and you don't tell them that they have cancer. That's a bigger problem than someone without cancer going into cancer treatment and getting chemo; they will not be happy, but hey, [00:25:00] they're still alive. If somebody takes a long time to get their cancer detected, I think that's the bigger problem.

[00:25:08] Ankit Raheja: So we have to be really careful about which model metric we should optimize for. Last but not least are the ML infrastructure metrics. Sometimes you need to worry about latency, right? Do you want a fast model or an accurate model? Those are two different things.

[00:25:35] Ankit Raheja: Do you want your model to be available on the edge, or do you want it in the cloud? So you need to worry about the infrastructure metrics as well. To summarize, three kinds of metrics: business and customer metrics, algorithmic metrics, and ML infrastructure and production metrics.
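To make the precision-versus-recall point concrete, the sketch below shows how the choice of metric and decision threshold encodes which error you care about more, using tiny synthetic labels and scores rather than any real screening data.

```python
# Sketch of choosing the metric that matches the cost of the error (illustrative only).
from sklearn.metrics import precision_score, recall_score, confusion_matrix

# Hypothetical ground truth and model scores for a screening-style problem,
# where a false negative (missed positive) is the costly error.
y_true = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
y_prob = [0.9, 0.2, 0.4, 0.8, 0.3, 0.1, 0.35, 0.6, 0.7, 0.55]

for threshold in (0.5, 0.3):
    y_pred = [int(p >= threshold) for p in y_prob]
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    print(
        f"threshold={threshold}: "
        f"precision={precision_score(y_true, y_pred):.2f} "
        f"recall={recall_score(y_true, y_pred):.2f} "
        f"false_negatives={fn}"
    )
# Lowering the threshold trades precision for recall, i.e. fewer missed positives.
```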

[00:25:56] Himakara Pieris: I presume the first and last buckets, customer metrics [00:26:00] and infrastructure metrics, are more visible, but algorithmic metrics tend to be less so, and sometimes they show up in the other two. Do you keep customers, especially early adopters or development partners, in the loop about algorithmic metrics?

[00:26:17] Himakara Pieris: And what is the level of conversation there as you share this with the business groups?

[00:26:22] Ankit Raheja: Oh, perfect. Again, Cisco is an amazing playbook I can talk about, and this came up so often. The number one rule: I follow a product management guru, and I'll give a shout-out to him.

[00:26:39] Ankit Raheja: His name is Shreyas Doshi. He says you are not dev-complete unless you have your telemetry built in. So step one for making sure you are ready for GA (in fact, we have this at CDK also) is that you need to have your telemetry in place. If you have your telemetry in place, things become a lot [00:27:00] easier later.

[00:27:01] Ankit Raheja: I'll give you an example from Cisco, the chatbot. The chatbot is there and it is giving a prediction saying, hey, this product should be returned. We would have a weekly meeting with our developers and tell them, hey, here is where the chatbot is saying we should return this product.

[00:27:28] Ankit Raheja: But the human, when they come in and check it, says no, it should not be returned. So we had a Tableau dashboard that captured the model scores as well as the human recommendation, and whenever there was a delta, we would surface that to the development team. They would take it, retrain the model, and that's how we kept iterating on it.

[00:27:58] Ankit Raheja: But the first thing is, if you [00:28:00] don't have telemetry in place, you will be regretting that you didn't build it in the first place. So step one for an ML model: have your telemetry in place, store it in some database, and have some way to surface those results, because otherwise it's just opinions.
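A hedged sketch of the prediction-versus-human telemetry described above: log each model score next to the eventual human decision, then query the disagreements that should be reviewed and fed back into retraining. Table and column names are hypothetical, not Cisco's or CDK's actual schema.

```python
# Minimal telemetry sketch: log model score vs. human decision, surface disagreements.
# Table and column names are hypothetical, not Cisco's or CDK's actual schema.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("telemetry.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS return_predictions (
        case_id        TEXT PRIMARY KEY,
        model_score    REAL,      -- probability the product should be returned
        model_decision INTEGER,   -- 1 = return, 0 = do not return
        human_decision INTEGER,   -- filled in once an engineer reviews the case
        logged_at      TEXT
    )
""")

def log_prediction(case_id: str, score: float, threshold: float = 0.5) -> None:
    conn.execute(
        "INSERT OR REPLACE INTO return_predictions VALUES (?, ?, ?, NULL, ?)",
        (case_id, score, int(score >= threshold), datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()

def record_human_decision(case_id: str, decision: int) -> None:
    conn.execute(
        "UPDATE return_predictions SET human_decision = ? WHERE case_id = ?",
        (decision, case_id),
    )
    conn.commit()

def disagreements():
    """Cases where the model and the human disagree; candidates for retraining data."""
    return conn.execute(
        "SELECT case_id, model_score, model_decision, human_decision "
        "FROM return_predictions "
        "WHERE human_decision IS NOT NULL AND human_decision != model_decision"
    ).fetchall()

# Usage: log_prediction("RMA-123", 0.82); record_human_decision("RMA-123", 0); disagreements()
```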

[00:28:19] Himakara Pieris: That's really good advice. On the back of that, is there any other advice you'd like to offer product leaders and product managers who are interested in getting into AI and building AI-powered solutions?

[00:28:34] Ankit Raheja: Yeah, definitely. The first thing I would ask of product leaders: I'm all about being as transparent and as inclusive as possible here.

[00:28:49] Ankit Raheja: Do not think machine learning is some ivory tower; it may have been thought of that way earlier. Now, with [00:29:00] the amazing GTM strategy and product strategy of OpenAI, so many folks are jumping on the bandwagon, and I do not want product leaders to be left behind. I want them to have it as one of the items

[00:29:15] Ankit Raheja: in their toolkit, but it is not your front and center. The first thing you need to understand is how and where you can use AI. I'll give you a few examples. Are you in the SaaS product layer? For example, the Amazon recommendation engine, Coda AI, Notion AI.

[00:29:38] Ankit Raheja: Those are SaaS products adding AI to their offerings. The second could be algorithmic products: Anthropic, OpenAI, Facebook's Llama. That's the second place where you can think about where you can use AI. Third is [00:30:00] AI infrastructure software companies; there's a company called Cube, that's one example.

[00:30:06] Ankit Raheja: The fourth is AI infrastructure hardware companies. As you can imagine, NVIDIA's stock is at an all-time high; it's a trillion-dollar company, all because of its GPUs, and with AI and the Snowflake partnership it has been taken to the next level. So first of all, think about which space you operate in, or where you think you can use AI.

[00:30:32] Ankit Raheja: Number two, start small by leveraging data, because at least you can have some kind of proof of concept to decide whether AI makes sense. What's happening now in the industry is that, number one, we are able to use a lot of annotation services, which can annotate data for you. [00:31:00]

[00:31:00] Ankit Raheja: Number two, you can also create your data synthetically; there are a lot of use cases there. Last but not least, check out these amazing websites. One is artificialintelligentnews.com. You can look at Andrew Ng's newsletter, The Batch, at DeepLearning.AI; he really comes up with amazing content. And last but not least, you can always ask ChatGPT, which is trained on billions of parameters, to tell you what some use cases could be.

[00:31:30] Ankit Raheja: At least it's the first step.
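For the point about creating data synthetically, one quick way to bootstrap a proof of concept before any real or annotated data is available is a synthetic dataset generator; a minimal sketch with scikit-learn:

```python
# Synthetic-data sketch: bootstrap a proof of concept before real data is available.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Generate a synthetic labeled dataset with a controllable amount of signal.
X, y = make_classification(
    n_samples=2_000,
    n_features=10,
    n_informative=4,
    class_sep=0.8,
    random_state=42,
)

# A quick end-to-end check that the pipeline and metrics work before real data arrives.
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
print(f"mean accuracy on synthetic data: {scores.mean():.3f}")
```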

[00:31:32] Himakara Pieris: Thank you very much for coming on the podcast today. Is there anything else you'd like to share?

[00:31:40] Ankit Raheja: The only thing I would like to share is for car dealership companies: CDK has shared an amazing AI survey, and I will provide the link to you, Himakara, for the show notes.

[00:31:59] Ankit Raheja: So if [00:32:00] any car dealerships are listening, they can always look at that link, and we are excited to talk to you.
