Earlier this year, Uber’s ATG (Advanced Technologies Group) and Data Visualization teams joined forces to improve the performance of self-driving vehicles. Uber isn’t the only player sprinting in this race to the future, but it has outpaced its competitors by leaps and bounds. ARTiBA has tried to decode the secret to this success.
If there is one zeal COVID-19 couldn’t dim, it’s the zeal for AI. Artificial intelligence is back in the headlines, this time with greater fury and a mission to change the world. AI is going to take the world places. Literally!
Let’s face it: this wasn’t always a happy ride. The success of AI-enabled autonomous vehicles was long mistrusted. Speculation abounded, from adversaries questioning the technology’s potential and impact to critics calling it yet another internet, bitcoin, or blockchain bubble.
That was until the unprecedented 2020 knocked. Things changed pretty fast then.
While 2020 was rattling almost every other business, driverless cars were becoming a familiar face in headlines the world over: one of the very few happy headlines, with audiences hooked on the latest developments and tracking every single move. And it wasn’t just automotive manufacturers racing to find the sweet spots.
Before we learn about Uber’s work, let’s dig into the basic functioning of self-driving cars.
SAE International categorizes self-driving cars into six levels of automation, from Level 0 (no automation) to Level 5 (full automation, with no driver assistance or monitoring required).
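As a rough illustration, the six levels can be captured in a simple lookup. The descriptions below are paraphrased from SAE J3016; this is an illustrative sketch, not an official SAE artifact.

```python
# Sketch of the SAE J3016 automation levels (descriptions paraphrased;
# illustrative only, not an official SAE definition).
SAE_LEVELS = {
    0: "No automation: the human driver performs all driving tasks",
    1: "Driver assistance: one automated function, e.g. steering or speed",
    2: "Partial automation: combined functions, driver stays fully engaged",
    3: "Conditional automation: system drives, driver must take over on request",
    4: "High automation: no driver input needed within a defined domain",
    5: "Full automation: no driver assistance or monitoring in any conditions",
}

def describe_level(level: int) -> str:
    """Return the description for an SAE level, raising on out-of-range input."""
    if level not in SAE_LEVELS:
        raise ValueError(f"SAE levels range from 0 to 5, got {level}")
    return SAE_LEVELS[level]
```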
Self-driving cars need sensors to perceive the activity around them. The sensor data feeds predictions of where nearby objects will be and informs the route the vehicle should take.
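A minimal sketch of the "future location" idea, assuming a constant-velocity motion model; real perception stacks use far richer models, and all names here are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    # 2-D position in metres and velocity in metres/second,
    # as a perception system might report after sensor fusion.
    x: float
    y: float
    vx: float
    vy: float

def predict_position(obj: TrackedObject, dt: float) -> tuple[float, float]:
    """Extrapolate an object's position dt seconds ahead under a
    constant-velocity assumption (a deliberately simple model)."""
    return (obj.x + obj.vx * dt, obj.y + obj.vy * dt)

# A pedestrian 10 m ahead, drifting left at 1 m/s:
ped = TrackedObject(x=10.0, y=0.0, vx=0.0, vy=1.0)
```

The planner can then reason about where the pedestrian will be two seconds from now rather than where they are at this instant.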
Here are a few of the most important components of Uber’s self-driving stack:
A set of microservices and tools, VerCD plays an important part in prototyping. “It tracks the dependencies among the various codebases, data sets, and AI models under development, ensuring that workflows start with a data set extraction stage, which is followed by data validation, model training, model evaluation, and model serving stages,” says Uber.
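VerCD itself is internal to Uber, but the staged workflow in the quote can be sketched as a small dependency graph with a topological ordering. The stage names come from the quote; the code below is illustrative and is not VerCD’s API.

```python
from graphlib import TopologicalSorter

# Each stage lists the stages it depends on, mirroring the workflow
# Uber describes: extraction -> validation -> training -> evaluation -> serving.
STAGES = {
    "data_set_extraction": [],
    "data_validation": ["data_set_extraction"],
    "model_training": ["data_validation"],
    "model_evaluation": ["model_training"],
    "model_serving": ["model_evaluation"],
}

def run_order() -> list[str]:
    """Return a valid execution order for the pipeline stages."""
    return list(TopologicalSorter(STAGES).static_order())
```

Modeling stages as a dependency graph rather than a fixed script is what lets a system like this rebuild only the stages downstream of whatever changed.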
Uber chose the web for a few primary reasons:
Fast iterations: The web is quick and easy to use; the user can simply refresh the browser to get the latest information.
Flexibility and shareability: Anyone can use the web. With just one click, an incident can be shared, diagnosed, and reported.
Creating a holistic context in self-driving cars requires a large amount of data from maps and runtime vehicle logs. Maps for self-driving vehicles are more intricate than regular maps. They give detailed information on the ground surface, lane types and boundaries, speed limits, turns, crosswalks, and impediments.
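To make "more intricate than regular maps" concrete, here is a hedged sketch of the kind of per-lane record an HD map might carry. All field names are invented for illustration; real map formats carry far more detail.

```python
from dataclasses import dataclass, field

@dataclass
class LaneSegment:
    # Illustrative HD-map fields; real formats are far richer.
    lane_id: str
    lane_type: str                  # e.g. "driving", "bike", "turn"
    speed_limit_mps: float          # speed limit in metres/second
    boundary_left: list[tuple[float, float]] = field(default_factory=list)
    boundary_right: list[tuple[float, float]] = field(default_factory=list)
    crosswalk_ids: list[str] = field(default_factory=list)

lane = LaneSegment(
    lane_id="lane-42",
    lane_type="driving",
    speed_limit_mps=13.4,  # roughly 30 mph
    crosswalk_ids=["xwalk-7"],
)
```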
To achieve this, ATG’s visualization team employs a suite of web-based, large-scale data visualization frameworks, including react-map-gl and deck.gl. GPU capacity is used to display “millions of geometries at a high frame rate”. deck.gl renders the data in the needed look (ground, path, lanes, etc.), with a typical log snippet rendering 60-100 layers at 30-50 frames per second.
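deck.gl composes a scene from typed layers, one per kind of geometry. A library-free sketch of that idea groups log geometries by kind into render "layers"; the grouping logic and record shapes below are our own illustration, not deck.gl’s API.

```python
from collections import defaultdict

# A log snippet as flat records; "kind" decides which render layer
# a geometry belongs to (ground, path, lane, ...).
log_geometries = [
    {"kind": "ground", "vertices": [(0, 0), (1, 0), (1, 1)]},
    {"kind": "lane", "vertices": [(0, 0), (0, 5)]},
    {"kind": "lane", "vertices": [(1, 0), (1, 5)]},
    {"kind": "path", "vertices": [(0.5, 0), (0.5, 4)]},
]

def build_layers(geometries):
    """Bucket geometries by kind, mimicking how a layered scene is
    assembled from one layer per geometry type."""
    layers = defaultdict(list)
    for geom in geometries:
        layers[geom["kind"]].append(geom)
    return dict(layers)

layers = build_layers(log_geometries)
```

In the real system, each such bucket would become a deck.gl layer drawn on the GPU, which is how tens of layers can be rendered per frame.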
While this looks like a promising start, the way driverless technology has emerged hints at many more shake-ups to come.
But artificial intelligence isn’t happy news for everyone. For many, it is a harbinger of radical, almost sweeping, change. AI beckons transformative times for the automotive industry, and Uber’s ATG is leading the march to the future.
Are you game?