Self-driving cars need a common language that everyone can understand


Autonomous vehicle companies are exploring the use of a common language – light patterns or standardized sounds – that would help driverless cars communicate their intentions to humans.

Why is this important: Autonomous vehicles will share the road with human-driven vehicles, pedestrians and cyclists for a long time. Developing a standard method of communication could build trust and reduce traffic accidents.

  • Unlike today’s drivers, AVs cannot make eye contact with other road users or use hand gestures to signal that it’s okay to cross the road.

What is happening: Argo.ai, the developer of an autonomous driving system, is urging other AV developers to adopt its new technical guidelines for safe interactions between robocars and cyclists.

  • The guidelines, created in collaboration with the League of American Bicyclists, urge AV companies to incorporate bike lanes into their maps, for example, and to model typical rider behavior – such as passing slower traffic or swerving around an open car door – in their algorithms.
  • Self-driving cars should also be programmed to slow down and create extra space when it’s unclear what a cyclist might do, according to Argo’s guidelines (a minimal sketch of such a rule follows this list).
  • “Roads have become much less safe for people outside of vehicles over the past decade,” Ken McLeod, League of American Bicyclists policy director, said in a statement.
  • “By addressing interactions with cyclists now, Argo demonstrates its commitment to the role of automated technology in reversing this deadly trend.”
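
Argo has not published reference code for these guidelines, but the “slow down and create extra space” rule above can be illustrated with a minimal, hypothetical planner hook. The class, function names, thresholds and units below are assumptions made for this sketch, not anything Argo has described.

```python
from dataclasses import dataclass

# Hypothetical illustration of the "slow down and create extra space when a
# cyclist's intent is unclear" rule from Argo's guidelines. Class names,
# thresholds and units are invented for this sketch.

@dataclass
class CyclistTrack:
    distance_m: float          # distance from the AV to the cyclist, in meters
    intent_confidence: float   # 0.0 (no idea) to 1.0 (certain), from a behavior model

def plan_adjustment(track: CyclistTrack,
                    current_speed_mps: float,
                    min_confidence: float = 0.7) -> dict:
    """Return a speed cap and extra lateral clearance for an uncertain cyclist."""
    if track.intent_confidence < min_confidence:
        # Intent unclear: cap speed and widen the passing gap.
        return {
            "speed_cap_mps": min(current_speed_mps, 5.0),  # roughly 18 km/h
            "extra_lateral_clearance_m": 1.0,
        }
    # Intent clear: keep nominal behavior.
    return {"speed_cap_mps": current_speed_mps, "extra_lateral_clearance_m": 0.0}

if __name__ == "__main__":
    uncertain = CyclistTrack(distance_m=20.0, intent_confidence=0.4)
    print(plan_adjustment(uncertain, current_speed_mps=12.0))
```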

Meanwhile, companies are also trying to establish a common language for autonomous vehicles to communicate with people outside the car.

  • Some companies promote what they see as best practices through the voluntary safety self-assessments of their autonomous driving technology that they file with the National Highway Traffic Safety Administration.
  • Ford’s safety report, for example, describes a white light bar mounted near the top of the windshield, where a pedestrian or cyclist would otherwise look for cues from a human driver.
  • Ford worked with researchers at the Virginia Tech Transportation Institute to develop distinct signal patterns indicating, for example, that an AV is picking up or dropping off passengers, or that it has recognized another road user and is tracking their movement (a sketch of how such intents might map to light-bar patterns follows this list).
  • Ford says it is leading an initiative to create a standard method of external visual communication.
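
Ford’s actual signal patterns are defined in its safety report and the Virginia Tech research rather than here; the sketch below only illustrates, with invented names and values, how the two intents mentioned above could map to light-bar animations.

```python
from enum import Enum

# Illustrative only: a hypothetical mapping from AV intents to light-bar
# animations, loosely based on the Ford / Virginia Tech signals described
# above. The real patterns, names and timings are not a public API.

class LightBarPattern(Enum):
    SLOW_PULSE = "slow_pulse"   # e.g., picking up or dropping off a passenger
    SOLID = "solid"             # e.g., acknowledging / tracking a nearby road user

INTENT_TO_PATTERN = {
    "pickup_or_dropoff": LightBarPattern.SLOW_PULSE,
    "acknowledge_road_user": LightBarPattern.SOLID,
}

def signal_for(intent: str) -> LightBarPattern:
    """Look up the light-bar animation for a given AV intent."""
    return INTENT_TO_PATTERN[intent]

if __name__ == "__main__":
    print(signal_for("pickup_or_dropoff"))  # LightBarPattern.SLOW_PULSE
```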

Zoox, meanwhile, which makes a purpose-built robotaxi, is testing its own communication signals using an array of lights incorporated into the vehicle’s design.

  • The vehicle, which has no distinct front or rear, also has 32 speakers capable of directing sound toward a specific location to communicate with other road users (a toy example of choosing which speaker to aim appears after this list).
  • “We can be more intentional in the way we communicate, in order to target specific users and to ensure that sounds are heard by the right people with the right tone,” said Riccardo Giraldi, senior director of product experience at Zoox.
  • “At the moment there is only one sound – the horn – which is annoying for cities,” he said.
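
Zoox has not described its audio steering in technical detail. As a toy example, assuming 32 speakers spaced evenly around the vehicle, the sketch below picks the speaker whose bearing is closest to a target road user; the geometry and names are hypothetical.

```python
# Toy illustration of directional audio: assuming 32 speakers spaced evenly
# around the vehicle body, pick the one aimed closest to a target road user.
# This is not Zoox's implementation; the geometry and names are invented.

NUM_SPEAKERS = 32
SPEAKER_BEARINGS_DEG = [i * (360 / NUM_SPEAKERS) for i in range(NUM_SPEAKERS)]

def angular_gap(a_deg: float, b_deg: float) -> float:
    """Smallest absolute angle between two bearings, in degrees."""
    diff = abs(a_deg - b_deg) % 360
    return min(diff, 360 - diff)

def closest_speaker(target_bearing_deg: float) -> int:
    """Index of the speaker whose bearing is nearest the target's bearing."""
    return min(range(NUM_SPEAKERS),
               key=lambda i: angular_gap(SPEAKER_BEARINGS_DEG[i], target_bearing_deg))

if __name__ == "__main__":
    # A pedestrian roughly 45 degrees off the vehicle's reference axis.
    print(closest_speaker(45.0))  # -> 4 (the speaker pointed at 45.0 degrees)
```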

The big picture: American roads are becoming increasingly deadly, with an 18% increase in U.S. road fatalities in the first half of 2021, driven mainly by risky behaviors like speeding, texting while driving and drunk driving.

  • The United States is on track for more than 40,000 road fatalities in 2021 – “a crisis,” according to Transportation Secretary Pete Buttigieg.

The bottom line: Just as everyone understands the meaning of red, yellow and green lights, AVs will need to develop standard means of communication.
