Watch Mobileye’s self-driving car drive through Jerusalem using only cameras

12 cameras, and that’s it

Screengrab from Mobileye demonstration.

When it comes to self-driving cars, the general axiom for sensors is “the more the merrier.” The safest systems are the ones that use a multiplicity of sensors, such as cameras, radar, ultrasonic sensors, and LIDAR. That redundancy is the whole point: if one sensor fails, the rest of the suite can help navigate the car to safety.

Mobileye, a company that specializes in chips for vision-based autonomous vehicles, believes in redundancy, but it also believes in the power of its camera-based system. At the Consumer Electronics Show in Las Vegas this week, the Intel-owned company demonstrated how one of its autonomous test vehicles navigated the complex streets of Jerusalem using cameras only.

The vehicle’s sensor suite includes 12 cameras... and that’s it! No radar, no ultrasonic sensors, and no LIDAR. LIDAR, which stands for light detection and ranging, is a laser sensor that most tech and car companies see as an essential component of self-driving cars. The sensors are mounted on the roofs, sides, and grilles of autonomous vehicles, where they send out thousands of laser pulses to map the surrounding environment.

But for Mobileye, it’s all about the cameras. In the video, a Mobileye employee explains that the two-dimensional information coming from the cameras is converted into a 3D model of the environment using “a chain of algorithmic redundancies based on multiple computer vision engines and deep networks.”
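Mobileye hasn’t published the details of that pipeline, but the underlying idea, using deep networks to recover depth and 3D structure from flat camera images, can be sketched with open-source tools. The minimal example below is not Mobileye’s system; it uses the publicly available MiDaS monocular depth model from PyTorch Hub to estimate a depth map from a single camera frame, and the file name "frame.jpg" is only a placeholder.

```python
# NOT Mobileye's code: a minimal, hypothetical sketch of camera-only depth
# estimation, using the open-source MiDaS model from PyTorch Hub.
import cv2
import torch

# Load the small MiDaS monocular depth model and its matching preprocessing.
midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
midas.eval()
transforms = torch.hub.load("intel-isl/MiDaS", "transforms")
transform = transforms.small_transform

# "frame.jpg" is a placeholder for one frame from a forward-facing camera.
img = cv2.cvtColor(cv2.imread("frame.jpg"), cv2.COLOR_BGR2RGB)
batch = transform(img)  # shape: (1, 3, H', W')

with torch.no_grad():
    prediction = midas(batch)
    # Resize the network output back to the original image resolution.
    depth = torch.nn.functional.interpolate(
        prediction.unsqueeze(1),
        size=img.shape[:2],
        mode="bicubic",
        align_corners=False,
    ).squeeze()

# Each pixel now holds a relative depth estimate: 3D structure recovered from
# flat 2D pixels, one small piece of what a full camera-only stack must do.
depth_map = depth.cpu().numpy()
print(depth_map.shape)
```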

That 3D environment is displayed in the video alongside a shot of the safety driver with his hands in his lap and a bird’s-eye view of the car from a drone overhead. The car handles complex driving situations, such as an unsignalized four-way intersection, traffic merges, and an unprotected left turn into heavy traffic, all while hitting speeds of up to 40 mph (64 km/h).

Of course, there’s a lot we don’t know about this drive, such as how many times Mobileye has driven this exact route and what, if any, remote assistance is being offered. But it’s certainly impressive to see an autonomous vehicle navigate such complex environments using just camera data.

Mobileye isn’t the only company that’s bullish on cameras. Tesla CEO Elon Musk has made his disdain for LIDAR widely known. Laser sensors are a “fool’s errand,” he said at an investor event last year, and “anyone relying on LIDAR is doomed. Doomed. Expensive sensors that are unnecessary. It’s like having a whole bunch of expensive appendices.” Musk has argued that a camera- and radar-based system, coupled with powerful AI software, can compensate for the lack of laser sensing.

Mobileye used to supply its computer vision hardware to Tesla for use in the automaker’s Autopilot system. But the two companies parted ways in 2016 after a Florida man was killed when his Tesla ran into a tractor-trailer while using Autopilot. Tesla claimed Mobileye was trying to block the carmaker’s effort to develop its own image recognition software, while Mobileye cited concerns about the safety of Autopilot.

Since it was acquired by Intel in 2017, Mobileye has announced partnerships with other automakers, including Volkswagen and China’s NIO and SAIC. The company is testing a self-driving taxi service in Israel, and just announced plans to deploy its robot taxis in South Korea.