Madan Thangavelu, director of engineering at Uber, provided some insights on real-world applications of AI and edge computing to improve customer service fulfillment during a recent presentation at the Mobile Edge Forum 2022, available on demand here.
“When I think about edge, there are just two principles that I’m always thinking about. In computer science, building large-scale systems, we’re always trying to get computation close to data. All the big data systems do that. And the other is to get your computation close to your users. And that’s what we do. When you’re talking about the central cloud, we get data centers out in many different geographic regions, we get our CDNs, we get our points of presence. So in some sense, MEC is a new entrant into this, but there are people who have been trying to achieve the various aspects of edge in production through these other means. It’s not an entirely new concept when it comes to B2C systems,” Thangavelu said.
Thangavelu also described how B2C companies have long achieved edge-like benefits through caching, before turning to a particular use case in which Uber could apply edge technology to smooth the provision of its services. “Today, we do things like caching, and you’ve probably heard about the way Netflix has built their Open Connect, where, at the ISPs, they cache their content in order to not have huge latency. So clearly, caching and those kinds of applications have come into a lot of B2C, and people have found innovative ways to push that around,” he said.
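The caching pattern he references can be summarized in a few lines. The sketch below is a hedged illustration with hypothetical names (`EdgeCache`, `fetch_from_origin`), not any company’s actual implementation: content is served from a cache co-located at the edge when available, and only fetched from the distant origin on a miss.

```python
# Hypothetical sketch of the edge-caching pattern described above: answer
# requests from a cache at the edge site when possible, and only fall back
# to the higher-latency central origin on a miss.

class EdgeCache:
    def __init__(self):
        self._store = {}  # content_id -> bytes, held at the edge site

    def get(self, content_id):
        return self._store.get(content_id)

    def put(self, content_id, payload):
        self._store[content_id] = payload


def fetch_from_origin(content_id):
    # Placeholder for a slow round trip back to the central cloud.
    return f"payload-for-{content_id}".encode()


def serve(content_id, cache: EdgeCache):
    cached = cache.get(content_id)
    if cached is not None:
        return cached                   # low-latency path: answered at the edge
    payload = fetch_from_origin(content_id)
    cache.put(content_id, payload)      # warm the edge for the next request
    return payload
```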
“One use case that’s interesting that I can call out, especially in a company like Uber, where you have the drivers and the riders on the street: we have this case where GPS locations are extremely important to get on time. If you think about a round trip where we have to trust the GPS locations, and then extract and, essentially, stream them to the rider to know where the driver is, it seems like a very straightforward use case. Now, ideally, what you want is, by the time those two people are close, within a few blocks, you really need a very high-throughput exchange between the driver and the rider. And a lot of times that will fundamentally change the stress involved in getting a pickup to happen. So in such a very simple use case, we can’t push a lot of the tech down; we cannot do that at the device. Because there’s an exchange going on between the driver and the rider, we do need to scrub, clean up, fit it onto the map, and then essentially transfer. That’s where we would want to put some of those [edge] capabilities,” he said.
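As a rough illustration of the proximity logic described in that use case, the sketch below assumes hypothetical thresholds and function names (`haversine_m`, `update_interval_s`); it does not reflect Uber’s actual services. The idea is simply to stream location updates at a higher rate once driver and rider are within a few blocks of each other.

```python
# Hedged sketch: increase the GPS exchange rate as the pickup gets close.
# Thresholds and names are illustrative assumptions, not Uber's API.
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def update_interval_s(driver_fix, rider_fix, few_blocks_m=300):
    """Choose how often to stream location updates, based on proximity."""
    distance = haversine_m(*driver_fix, *rider_fix)
    if distance <= few_blocks_m:
        return 1    # near pickup: high-throughput exchange, ~1 update/second
    if distance <= 2_000:
        return 4    # approaching: moderate rate
    return 10       # en route: relaxed rate to save battery and bandwidth

# Example: driver two blocks away from the rider -> fast update cadence.
print(update_interval_s((37.7753, -122.4183), (37.7749, -122.4194)))
```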
Thangavelu also talked about the emerging role of artificial intelligence (AI) and machine learning (ML) when it comes to managing data at the edge.
“Companies largely put their ML model training, with all the data, in the central cloud; they build their models there, and a lot of times the devices actually send the data to the backend. So that’s one model. And the other one is, clearly, you have to put your ML model down into your app. When you think about companies like Snap, when they do VR, you want to do object detection, so you can put a lot of that model down into the app,” he said.
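The two deployment models he contrasts can be sketched as follows. The endpoint URL and function names are illustrative assumptions: in the first pattern the device ships its data to a model served from the central cloud; in the second, a compact model bundled with the app runs on the device itself.

```python
# Illustrative sketch (names and endpoint are assumptions) of the two
# deployment models described above.
import json
import urllib.request

def predict_in_cloud(features, endpoint="https://ml.example.com/predict"):
    # Pattern 1: the device sends its data to the backend, where the large,
    # centrally trained model runs; every prediction costs a network round trip.
    req = urllib.request.Request(
        endpoint,
        data=json.dumps({"features": features}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["prediction"]

def predict_on_device(features, local_model):
    # Pattern 2: a smaller model exported into the app runs locally, avoiding
    # the round trip but requiring an app update to ship new weights.
    return local_model(features)
```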
“I feel like a lot of the really critical AI that is meaningful to a business is mostly on the devices. And those will be the ones that will be the first to move out [to the edge], because now you get the leverage of moving fast; you don’t have to wait for a device firmware update to get your AI out to the devices,” the Uber executive added.