A Look into Uber’s Futuristic Self-Driving Cars Technology
Aug 28, 2020

Earlier this year, Uber’s ATG (Advanced Technologies Group) and Data Visualization teams joined forces to improve the performance of its self-driving vehicles. Uber isn’t the only one sprinting in this race to the future, but it has outdone its competitors by leaps and bounds. ARTiBA has tried to decode the secret to this success.

If COVID-19 couldn’t dim one zeal, it is the zeal for AI. Artificial Intelligence is back in the headlines, this time with greater fury and a mission to change the world. AI is going to take the world places. Literally!

Tracing The Trajectory of the Moonshot

Let’s face it: this wasn’t a happy ride. The success of AI-enabled autonomous vehicles was long mistrusted. Speculation abounded, from adversaries questioning the technology’s potential and impact to critics calling it yet another internet, bitcoin, or blockchain bubble.

That was until the unprecedented 2020 knocked. Things changed pretty fast then.

While 2020 was rattling almost every other business, driverless cars were becoming a familiar face in headlines the world over: one of the very few happy stories, with an audience hooked on the latest developments and tracking every single move. And it wasn’t just automotive manufacturers racing to find the sweet spot.

The list of frontrunners was, and still is, ever-expanding, spanning big names and budding startups alike. A level playing field for tech champions and innovators, driverless technology has captivated behemoths and beginners in equal measure: Nvidia, Intel, Google, BMW, Microsoft, Aurora, Cruise, and Uber.

How Driverless Cars Work

Before we learn about Uber’s work, let’s dig into the basic functioning of self-driving cars.

SAE International categorizes self-driving cars into six levels of driving automation: level zero means no automation, while level five is fully automated, requiring no driver assistance or monitoring.

Self-driving cars need sensors to perceive the activity around them. The sensor data feeds models that estimate where nearby objects will be next and which route the vehicle should take.
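To make the idea concrete, here is a minimal TypeScript sketch (illustrative only, not Uber’s code) of the SAE automation levels and of a deliberately naive constant-velocity prediction of an object’s future position; all names and numbers here are assumptions for illustration.

```typescript
// Illustrative sketch only: the SAE automation levels and the shape of a
// perception output that feeds motion prediction. Not Uber's code.

enum SaeLevel {
  NoAutomation = 0,          // driver does everything
  DriverAssistance = 1,
  PartialAutomation = 2,
  ConditionalAutomation = 3,
  HighAutomation = 4,
  FullAutomation = 5,        // no driver assistance or monitoring required
}

// A detected object as a perception stack might report it (hypothetical fields).
interface DetectedObject {
  id: string;
  kind: "vehicle" | "pedestrian" | "cyclist" | "unknown";
  position: { x: number; y: number };  // meters, vehicle frame
  velocity: { x: number; y: number };  // meters per second
}

// Naive constant-velocity prediction of where an object will be in t seconds.
// Real systems use learned motion models; this only shows the idea.
function predictPosition(obj: DetectedObject, tSeconds: number) {
  return {
    x: obj.position.x + obj.velocity.x * tSeconds,
    y: obj.position.y + obj.velocity.y * tSeconds,
  };
}

// Example: a pedestrian 10 m ahead, drifting sideways at 1 m/s.
const pedestrian: DetectedObject = {
  id: "ped-1",
  kind: "pedestrian",
  position: { x: 10, y: 0 },
  velocity: { x: 0, y: 1 },
};
console.log(predictPosition(pedestrian, 2)); // { x: 10, y: 2 }
```

Real perception stacks replace the constant-velocity step with learned motion models, but the basic flow of detect, predict, and plan stays the same.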

How Uber is Improving the Self-Driving Vehicles Experience

“The ATG Visualization team built a platform that enables engineers and operators across ATG to quickly inspect, debug, and explore information collected from offline and online testing.”

Here are a few of its most important components:

VerCD

A set of microservices and tools, VerCD plays an important part in prototyping. “It tracks the dependencies among the various codebases, data sets, and AI models under development, ensuring that workflows start with a data set extraction stage, which is followed by data validation, model training, model evaluation, and model serving stages”, says Uber.

VerCD increases the “frequency of fresh data set builds by a factor of 10, leading to significant efficiency gains.”
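Uber has not published VerCD’s API, so the following TypeScript sketch is purely hypothetical: it only illustrates how the dependency chain quoted above (extraction, validation, training, evaluation, serving) can be modeled as a small DAG and resolved into a build order.

```typescript
// Hypothetical sketch of how a dependency-tracking service like VerCD might
// model workflow stages. Names are illustrative, not VerCD's actual API.

type Stage =
  | "dataset_extraction"
  | "data_validation"
  | "model_training"
  | "model_evaluation"
  | "model_serving";

// Each stage lists the stages it depends on, forming a simple DAG.
const pipeline: Record<Stage, Stage[]> = {
  dataset_extraction: [],
  data_validation: ["dataset_extraction"],
  model_training: ["data_validation"],
  model_evaluation: ["model_training"],
  model_serving: ["model_evaluation"],
};

// Topological order: run each stage only after its dependencies complete.
function buildOrder(deps: Record<Stage, Stage[]>): Stage[] {
  const order: Stage[] = [];
  const visited = new Set<Stage>();
  const visit = (stage: Stage) => {
    if (visited.has(stage)) return;
    visited.add(stage);
    deps[stage].forEach(visit);
    order.push(stage);
  };
  (Object.keys(deps) as Stage[]).forEach(visit);
  return order;
}

console.log(buildOrder(pipeline));
// ["dataset_extraction", "data_validation", "model_training",
//  "model_evaluation", "model_serving"]
```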

Web as platform:

Uber used the web for three primary reasons:

  • Fast iterations: The web is quick and easy to use. Users can simply refresh the browser to get the latest information.

  • Flexibility and shareability: Anyone can open the tools in a browser. With just one click, an incident can be diagnosed and reported.

  • Customization and collaboration: Self-driving cars need a constant feed of new services and features. HTML5 and JavaScript can customize the UI on the fly, as the sketch after this list illustrates.
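As a rough illustration of that last point, here is a hypothetical TypeScript sketch (not ATG’s actual code) of a browser dashboard whose panels are registered and re-rendered on the fly; the Panel interface and function names are invented for this example.

```typescript
// Hypothetical sketch: composing a debugging UI at runtime in the browser.
// Panels register themselves and the page re-renders when new log data arrives.

interface Panel {
  title: string;
  render: (log: unknown) => string; // returns an HTML fragment
}

const panels: Panel[] = [];

function registerPanel(panel: Panel): void {
  panels.push(panel);
}

function renderDashboard(log: unknown): string {
  return panels
    .map((p) => `<section><h2>${p.title}</h2>${p.render(log)}</section>`)
    .join("\n");
}

// New views can be added without redeploying native tooling:
registerPanel({
  title: "Planner decisions",
  render: (log) => `<pre>${JSON.stringify(log, null, 2)}</pre>`,
});

console.log(renderDashboard({ decision: "yield", confidence: 0.92 }));
```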

Data depiction:

Creating a holistic context in self-driving cars requires a large amount of data from maps and runtime vehicle logs. Maps for self-driving vehicles are more intricate than regular maps. They give detailed information on the ground surface, lane types and boundaries, speed limits, turns, crosswalks, and impediments.
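As a rough sketch of what such a map might contain, the TypeScript types below model the elements listed above (ground surface, lanes and boundaries, speed limits, crosswalks, impediments); the field names are assumptions, not Uber’s actual schema.

```typescript
// Illustrative types only: the kinds of information an HD map for a
// self-driving vehicle carries. Field names are assumptions.

type LaneBoundaryType = "solid" | "dashed" | "curb" | "virtual";

interface Lane {
  id: string;
  speedLimitKph: number;
  boundaries: { left: LaneBoundaryType; right: LaneBoundaryType };
  centerline: Array<[number, number]>; // (x, y) points in the map frame
}

interface Crosswalk {
  id: string;
  polygon: Array<[number, number]>;
}

interface HdMapTile {
  groundHeightGrid: number[][]; // sampled ground-surface elevations
  lanes: Lane[];
  crosswalks: Crosswalk[];
  impediments: Array<{ id: string; polygon: Array<[number, number]> }>;
}
```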

Uber’s system presents complex data in the form of precise visual metaphors. It shows realistic imagery and can also suggest alternative decisions.

To achieve this, ATG’s visualization team relies on a suite of web-based, large-scale data visualization frameworks such as react-map-gl and deck.gl. GPU capabilities are used to display “millions of geometries at a high frame rate”. deck.gl renders the data with the required appearance (ground, path, lanes, etc.), with a typical log snippet rendering 60-100 layers at 30-50 frames per second.
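The snippet below is a minimal sketch of that approach using the open-source deck.gl API named above: a path layer and a polygon layer rendered over a map view. The layer types and props are standard deck.gl, but the data, coordinates, and styling are made up; this is not Uber’s pipeline.

```typescript
// Minimal deck.gl sketch: render a planned path and a crosswalk polygon.
// Data and coordinates are invented for illustration.
import { Deck } from "@deck.gl/core";
import { PathLayer, PolygonLayer } from "@deck.gl/layers";

interface PathDatum {
  path: [number, number][];
}
interface PolygonDatum {
  polygon: [number, number][];
}

const plannedPath: PathDatum[] = [
  { path: [[-122.42, 37.780], [-122.41, 37.785], [-122.40, 37.790]] },
];

const crosswalks: PolygonDatum[] = [
  {
    polygon: [
      [-122.412, 37.784],
      [-122.411, 37.784],
      [-122.411, 37.785],
      [-122.412, 37.785],
    ],
  },
];

new Deck({
  initialViewState: { longitude: -122.41, latitude: 37.785, zoom: 15 },
  controller: true,
  layers: [
    // Each kind of geometry becomes its own layer; a real log snippet
    // stacks dozens of these (ground, lanes, paths, objects, ...).
    new PathLayer({
      id: "vehicle-path",
      data: plannedPath,
      getPath: (d: PathDatum) => d.path,
      getColor: [0, 128, 255],
      widthMinPixels: 4,
    }),
    new PolygonLayer({
      id: "crosswalks",
      data: crosswalks,
      getPolygon: (d: PolygonDatum) => d.polygon,
      getFillColor: [255, 255, 255, 120],
    }),
  ],
});
```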

A Sneak Peek at the Future:

While this looks like a promising start, the way driverless technology has emerged hints at many more shake-ups to come.

But artificial intelligence isn’t happy news for everyone. For many, it is a harbinger of radical, almost sweeping, change. AI heralds transformative times for the automotive industry. And Uber’s ATG is leading this march to the future.

Are you game?
