How AVs work: LiDARs, radars and cameras, oh my!

A close-up of a May Mobility vehicle seen from the side and displaying "Autonomous Vehicle" on the window.

Since opening our doors in 2017, May Mobility has transformed mobility as we work to design the world’s best autonomous vehicle (AV) technology. The transit agencies, cities and businesses we partner with rely on us to fill transit gaps and solve unique problems in their communities.

But for all the benefits we bring to our clients, we often get asked: how do May Mobility autonomous vehicles actually work? If this is a question that’s been on your mind, you’ve come to the right place.

How do autonomous vehicles work?

AVs are equipped with sensors that let them build a map of their environment. This information is processed by autonomous driving software in the vehicle’s computer. Vehicles at autonomous driving Level 3 and above can determine a suitable path and tell the vehicle whether it should accelerate, brake or steer at any given moment. May Mobility designs Level 4 autonomous vehicles, which can handle complex, nuanced driving situations without any driver intervention.
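
For readers who like to see things in code, here is a simplified, purely illustrative sketch of that sense-plan-act cycle in Python. None of these names come from our production software; they only show how sensing, planning and vehicle control connect.

```python
from dataclasses import dataclass

@dataclass
class Command:
    """One driving decision: how hard to accelerate or brake, and how to steer."""
    acceleration: float    # m/s^2, negative values brake
    steering_angle: float  # radians, 0.0 keeps the wheel straight

def drive_one_cycle(build_world_model, choose_command, apply_command) -> None:
    """A single loop of the pipeline described above: sense, plan, act."""
    world_model = build_world_model()      # sensors map the environment
    command = choose_command(world_model)  # software picks a suitable path
    apply_command(command)                 # vehicle accelerates, brakes or steers

# Stubbed example, just to show how the pieces fit together:
drive_one_cycle(
    build_world_model=lambda: {"obstacle_ahead": False},
    choose_command=lambda world: Command(
        acceleration=-3.0 if world["obstacle_ahead"] else 1.0,
        steering_angle=0.0,
    ),
    apply_command=lambda cmd: print(f"accelerating at {cmd.acceleration} m/s^2"),
)
```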

The two key technologies behind May Mobility autonomous vehicles

The exceptional safety and performance of May Mobility AVs come from two cooperative systems: our combined sensor stack and our revolutionary Multi-Policy Decision Making (MPDM) platform. Working together, these systems let the vehicle react quickly and help ensure the safety of every road user.

Combined sensor stack

Our vehicles feature a platform-agnostic Autonomous Driving Kit (ADK) composed of the autonomous driving node (ADN) computer inside the car and the cameras, radars and LiDAR (Light Detection and Ranging) sensors mounted on top of and around the vehicle. Together, these provide a complete view of the vehicle’s surroundings, including static elements, dynamic elements and environmental conditions. The detailed information they gather is then sent to the ADN.

A combination of cameras with narrow- and wide-angle lenses allows the AV to classify and differentiate between the objects around it. The cameras also detect traffic lights and watch for vehicles and pedestrians. Each camera is paired with a LiDAR for more comprehensive detection capabilities.

LiDAR is a laser-based sensor that bounces pulses of light off the vehicle’s surroundings to build a 3D representation of its environment. This allows it to measure distances and to identify and classify objects and obstacles. Our AVs carry one large, long-range LiDAR on top, two smaller ones on each side and one each at the front and rear for comprehensive perception and localization of the surrounding area.
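
The distance measurement itself comes from a simple time-of-flight relationship: the sensor times how long a light pulse takes to bounce back and halves the round trip. The snippet below is a generic illustration of that formula, not May Mobility firmware.

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def lidar_distance(round_trip_time_s: float) -> float:
    """Distance to an object from the time a laser pulse takes to bounce back.

    The pulse travels out and back, so the one-way distance is half the
    round trip: d = c * t / 2.
    """
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A pulse that returns after ~0.8 microseconds hit something roughly 120 m away.
print(round(lidar_distance(0.8e-6), 1))  # -> 119.9
```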

Radar is a sensor that uses radio waves to provide accurate information about the distances and velocities of surrounding elements. Radars are positioned on each corner and in the front, allowing the AV to detect how fast or slow vehicles and pedestrians are moving and react accordingly.
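
Those velocity readings rely on the Doppler effect: a radio wave reflected by a moving object comes back with a shifted frequency, and that shift reveals how quickly the object is closing. Here is a generic illustration; the 77 GHz figure is a typical automotive radar frequency used for the example, not a statement about our specific hardware.

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def radar_relative_speed(doppler_shift_hz: float, carrier_freq_hz: float) -> float:
    """Relative speed of a target from the Doppler shift of a reflected radar wave.

    The wave travels out and back, so the shift is roughly f_d = 2 * v * f0 / c,
    which gives v = f_d * c / (2 * f0). A positive result means the target is
    closing in on the sensor.
    """
    return doppler_shift_hz * SPEED_OF_LIGHT / (2.0 * carrier_freq_hz)

# A ~5.1 kHz shift on a 77 GHz automotive radar corresponds to ~10 m/s closing speed.
print(round(radar_relative_speed(5_140, 77e9), 1))  # -> 10.0
```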

How do these systems function together?

Each type of sensor technology has its own strengths and weaknesses, but each provides a piece of the larger puzzle. Working together, they provide all the information the ADN needs to maintain a complete 360-degree view of the environment up to 120 meters away. While some sensors provide overlapping data, it is that redundancy that helps the vehicle consistently make safe driving decisions.
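
As a toy illustration of why that overlap matters, the sketch below fuses three distance estimates by taking their median, so one noisy or blinded sensor can’t mislead the vehicle. Real sensor fusion is far more sophisticated; this only conveys the idea of redundancy.

```python
from statistics import median

def fused_distance(camera_m, lidar_m, radar_m):
    """Combine overlapping distance estimates by taking their median, so one
    noisy or missing reading can't dominate the result. Purely illustrative;
    real sensor fusion goes far beyond this."""
    readings = [r for r in (camera_m, lidar_m, radar_m) if r is not None]
    return median(readings) if readings else None

# All three sensors roughly agree -> a trustworthy estimate.
print(fused_distance(41.8, 40.2, 40.5))  # -> 40.5
# One sensor is blinded (say, a camera in glare) -> the others still cover it.
print(fused_distance(None, 40.2, 40.5))  # -> 40.35
```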

Multi-Policy Decision Making platform

One of the biggest challenges in autonomous driving is safely handling the unexpected. Some AV companies train their software on massive data sets using machine learning. But if their vehicle encounters a situation those data sets didn’t capture, it may not know how to handle it in a robust – or perhaps even safe – manner.

Instead, it’s important to have a vehicle that can generalize. This is the technical term for a vehicle that can handle situations it wasn’t programmed for. In other words, it’s an AV that can imagine novel situations and make decisions like a real human driver.

Using a high-end NVIDIA graphics processing unit (GPU), data from the combined sensor stack is processed within milliseconds, ready for our proprietary Multi-Policy Decision Making (MPDM) technology to do its work. Our MPDM platform uses that constant stream of data to run thousands of simulations per second. These allow it to predict potential hazards and choose the safest course of action at every moment. MPDM doesn’t just calculate what cars, pedestrians and other dynamic agents might do, but also how those actions might interact and change. All of this makes MPDM especially capable of handling the full range of possible driving conditions in the real world.
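
MPDM itself is proprietary, but the general multi-policy idea can be sketched: enumerate a few candidate behaviors, forward-simulate each one against predictions of what other road users might do, and commit to the behavior whose simulated outcomes look safest. Everything in the snippet below (the policy names, the scoring, the numbers) is a made-up illustration of that loop, not our actual planner.

```python
import random

# Hypothetical candidate behaviors the planner could commit to for the next moment.
POLICIES = ["keep_speed", "slow_down", "stop"]

def simulate(policy: str, pedestrian_risk: float, rollouts: int = 100) -> float:
    """Forward-simulate one policy against sampled pedestrian behavior and
    return a score (higher means safer outcomes with more progress)."""
    score = 0.0
    for _ in range(rollouts):
        pedestrian_steps_out = random.random() < pedestrian_risk
        if policy == "keep_speed" and pedestrian_steps_out:
            score -= 10.0   # this rollout ends in a dangerous situation
        elif policy == "stop":
            score -= 0.5    # safe, but the ride makes no progress
        else:
            score += 1.0    # safe and keeps moving
    return score

def choose_policy(pedestrian_risk: float) -> str:
    """Pick the behavior whose simulated outcomes score best right now."""
    return max(POLICIES, key=lambda p: simulate(p, pedestrian_risk))

print(choose_policy(pedestrian_risk=0.3))  # -> "slow_down"
```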

A close-up, front-facing studio view of May Mobility's Toyota Sienna displaying the MPDM software on the windshield
MPDM in Action

The union of our autonomous driving kit and MPDM platform empowers us to solve some of the biggest problems in autonomous driving. When you step into one of our vehicles, you can feel confident in our ability to get you to where you’re going safely, even when the unexpected occurs. To learn more, access our Resource Library for whitepapers and videos about our vehicles and technology.

