To understand why there’s so much support behind LIDAR today, it’s important to look at other technologies with similar goals.
The original depth-sensing robot was the humble bat (50 million years old!). A bat (or dolphin, among others) has some of the same capabilities as LIDAR, thanks to echolocation, otherwise known as Sonar (sound navigation and ranging). Instead of measuring light beams like LIDAR, Sonar measures distance using sound waves.
After 50 million years of biological exclusivity, World War I brought the first major deployment of man-made Sonar sensors, driven by the advent of submarine warfare. Sonar works excellently in water, where sound travels far better than light or radio waves (more on that in a second). Sonar sensors are in active use on cars today, primarily as parking sensors. These short-range (~5m) sensors are a cheap way to know just how far that wall is behind your car. Sonar hasn’t been proven to work at the kinds of ranges a self-driving car demands (60m+).
Radar (radio detection and ranging), much like Sonar, was another technology developed during an infamous World War (WW2, this time). Instead of light or sound waves, it uses radio waves to measure distance. We make use of a lot of Radar (using Delphi sensors) on Homer, and it’s a tried-and-tested method that can accurately detect and track objects as far as 200m away.
Radar has very little downside. It performs well in extreme weather conditions and is available at an affordable price point. Radar is heavily used not only for detecting objects, but for tracking them too (e.g. understanding how fast a car is going and in which direction). Radar doesn’t give you the granularity of LIDAR, but the two are very complementary, and it’s definitely not either/or.
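The reason Radar is so good at tracking speed is the Doppler effect: a target moving toward the sensor compresses the reflected radio wave, and the frequency shift maps directly to radial speed. A hedged sketch of the standard formula (the 77 GHz carrier is an assumption here, a common automotive radar band; the source doesn’t give the Delphi sensor’s parameters):

```python
C = 299_792_458.0  # speed of light, m/s

def radial_speed(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Doppler velocity: v = delta_f * c / (2 * f0).
    The factor of 2 is there because the wave is shifted once on the
    way out and once again on reflection."""
    return doppler_shift_hz * C / (2 * carrier_hz)
```

A car closing at 10 m/s produces a shift of only a few kHz at 77 GHz, which is easy to measure electronically; that is why even cheap radar units report velocity directly, while LIDAR has to infer it from successive frames.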
LIDAR was born in the 1960s, just after the advent of the laser. During the Apollo 15 mission in 1971, astronauts mapped the surface of the moon, giving the public the first glimpse of what LIDAR could do.
Before LIDAR was even considered for automotive and self-driving use, one of its popular use-cases was archeology. LIDAR provides a ton of value when mapping large swaths of land, and both archeology and agriculture have benefited tremendously from it.
“When Lidar was first used at Angamuco we had no idea how large the area was that included buildings and structures, if it was even a city,” team member Professor Steve Leisz told the BBC. Perhaps more surprisingly the team also found a ball court for a Meso American game called pok-ta-pok, and pyramids, including one that Fisher had walked within 10m of the previous year. “That was a complete surprise,” said Leisz. — Lidar archaeology shines a light on hidden sites
It wasn’t until the 2000s that LIDAR was first utilized on cars, made famous by Stanley (and later, Junior) in the 2005 DARPA Grand Challenge.
Stanley, the winner of the 2005 DARPA Grand Challenge, made use of five SICK LIDAR sensors mounted on the roof, in addition to a military-grade GPS, gyroscopes, accelerometers and a forward-facing camera looking out 80m+. All of this was powered by six 1.6GHz Pentium Linux PCs sitting in the trunk.
The fundamental challenge with the SICK LIDARs (which powered a significant portion of the 2005 challenge vehicles) is that each laser scan is essentially a cut made by a single plane, so teams had to be methodical in how they pointed them. Many mounted them on tilting stages in order to “sweep” a segment of space. In simple terms: SICK was a 2D LIDAR (a few beams of light in one direction) vs. the modern 3D LIDARs (tons of beams of light in all directions) we know today.
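The 2D-vs-3D distinction is easy to see in code: a planar scanner returns (range, azimuth) pairs at a single elevation, while a 3D unit sweeps the same azimuths across a stack of elevation angles. A hypothetical sketch (the function name and layout are illustrative, not any vendor’s API):

```python
import math

def scan_to_points(ranges_m, azimuths_rad, elevations_rad=(0.0,)):
    """Convert LIDAR returns to Cartesian points.
    A 2D (single-plane) scanner has one elevation angle (0);
    a 3D unit sweeps many azimuths across dozens of stacked
    elevation angles, producing a full point cloud."""
    points = []
    for elev in elevations_rad:
        for r, az in zip(ranges_m, azimuths_rad):
            x = r * math.cos(elev) * math.cos(az)
            y = r * math.cos(elev) * math.sin(az)
            z = r * math.sin(elev)
            points.append((x, y, z))
    return points

# A planar SICK-style cut: every point lands at z == 0,
# which is why teams had to physically tilt the sensor to see more.
flat = scan_to_points([10.0, 10.0], [0.0, math.pi / 2])
```

With 64 elevation angles instead of one, the same loop yields the dense surround point cloud that made the 2007 Urban Challenge vehicles possible.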
Velodyne has long been the market leader in LIDAR, but it didn’t start out that way. Velodyne began life as an audio company in 1983, specializing in low-frequency sound and subwoofer technology; its subwoofers contained custom sensors, DSPs and custom DSP control algorithms. Velodyne became the LIDAR company we know today around the same time as Stanley’s debut. Velodyne founders David and Bruce Hall first entered the 2004 DARPA competition as Team DAD (Digital Audio Drive). For the second race in 2005, David Hall invented and patented the 3D laser-based real-time system that laid the foundation for Velodyne’s current LIDAR products. By the third DARPA challenge in 2007, the majority of teams used this technology as the basis of their perception systems. David Hall’s invention is now in the Smithsonian as a foundational breakthrough enabling autonomous driving.
Team DAD in 2005
The first Velodyne LIDAR scanner was about 30 inches in diameter and weighed close to 100 pounds. Choosing to commercialize the LIDAR scanner instead of competing in subsequent challenge events, Velodyne was able to dramatically reduce the sensor’s size and weight while also improving performance. Velodyne’s HDL-64E LIDAR sensor was the primary means of terrain map construction and obstacle detection for all the top DARPA Urban Challenge teams in 2007 and used by five out of six of the finishing teams, including the winning and second-place teams.
Some teams relied exclusively on the LIDAR for the information about the environment used to navigate an autonomous vehicle through a simulated urban environment. – Wikipedia
Why did LIDAR take off with self-driving cars? In a word: mapping. LIDAR allows you to generate huge 3D maps (its original application!), within which you can then navigate a car or robot predictably. By using LIDAR to map and navigate an environment, you can know ahead of time the bounds of a lane, or that there is a stop sign or traffic light 500m ahead. This kind of predictability is exactly what a technology like self-driving cars requires, and it has been a big reason for the progress of the last 5 years.
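One hedged illustration of why mapping falls out of LIDAR so naturally: bin each returned point into a grid, and the static structure of the environment (curbs, poles, signs) accumulates into a prior map you can localize against later. This is a toy occupancy-count sketch, not any team’s actual pipeline:

```python
def occupancy_grid(points_xy, cell_m=0.5):
    """Accumulate LIDAR hits into a sparse 2D occupancy count.
    Cells hit repeatedly across many scans are likely static
    structure -- the raw material of a prior map. (Real systems
    use far richer representations, e.g. probabilistic grids.)"""
    grid = {}
    for x, y in points_xy:
        cell = (int(x // cell_m), int(y // cell_m))
        grid[cell] = grid.get(cell, 0) + 1
    return grid

# Two nearby returns fall into the same 0.5m cell; the stray
# return gets its own cell with a count of 1.
hits = occupancy_grid([(1.2, 3.4), (1.3, 3.4), (9.0, 0.1)])
```

Once such a map exists, localization is the inverse problem: match a live scan against the stored grid to recover where the car is, which is what makes “a stop sign 500m ahead” knowable in advance.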
As LIDARs have become higher-resolution and operate at longer ranges, a new use-case has emerged in object detection and tracking. Not only can a LIDAR map enable you to know precisely where you are in the world and help you navigate it, but it can also detect and track obstacles like cars, pedestrians and, according to Waymo, football helmets.
Modern LIDAR enables you to differentiate between a person on a bike and a person walking, and even tell at what speed and in which direction they are moving.
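Speed and direction fall out of successive frames: difference a tracked object’s position between two scans. A minimal sketch under that assumption (a production tracker would smooth this with something like a Kalman filter rather than raw differencing):

```python
import math

def velocity(prev_xy, curr_xy, dt_s):
    """Estimate an obstacle's speed and heading from its position
    in two consecutive LIDAR frames. Raw frame-to-frame differencing
    is noisy; real trackers filter these estimates over time."""
    dx = curr_xy[0] - prev_xy[0]
    dy = curr_xy[1] - prev_xy[1]
    speed = math.hypot(dx, dy) / dt_s       # metres per second
    heading = math.atan2(dy, dx)            # radians, in the sensor frame
    return speed, heading

# A cyclist moves 0.6m between scans taken 0.1s apart: ~6 m/s.
speed, heading = velocity((10.0, 5.0), (10.6, 5.0), 0.1)
```

At typical 10 Hz scan rates this is exactly the cadence the example uses, which is why high frame-rate, high-resolution LIDAR directly improves tracking quality.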
A Google car
The combination of amazing navigation, predictability and high-resolution object tracking has meant that LIDAR is the key sensor in self-driving cars today, and it’s hard to see that domination changing. Unless…
The industry is marching ahead with a real focus on two fronts: decreasing cost, and increasing resolution and range.
Cost Decrease
Solid-state LIDAR opens up the potential of powerful sub-$1k LIDAR units, which today can cost as much as $80k apiece. LeddarTech is one of the leaders in this early market.
Here’s what Velodyne has to say about solid-state:
Solid state, fixed sensors are driven by the idea that you want an embeddable sensor with the smallest size at the lowest possible cost. Naturally, that also means that you have a smaller field of view. Velodyne supports both fixed and surround view sensors. The fixed sensors are miniaturized to be embedded. From a cost standpoint, both contain lenses, lasers and detectors. The lowest cost system is actually via surround view sensors because rotation reuses the lens, lasers and detectors across the field of view, versus using additional sensors each containing individual lenses, lasers and detectors. This reuse is both the most economical, as well as the most powerful, as it reduces the error associated with merging different points of view in real-time — something that really counts when the vehicle is moving at speed.
Resolution and Range Increase
The huge jump in the number of applications for LIDAR has brought with it a flood of talented founders and teams starting companies in the space. Higher-resolution output and increased tracking range (200m in some cases) will provide better object recognition and tracking, and are among the key differentiators for sensors from startups like Luminar.
At Voyage, we’ve placed a bet on LIDAR. We love all the benefits it brings, and believe the ecosystem will take care of bringing down the cost just in time for when we need to scale our autonomous taxi service.
If you’re a LIDAR startup and want to test your sensors, we’d love to be one of your first customers. Reach out on our website!
There are a number of startups out there approaching the problem of self-driving cars using cameras alone (and perhaps radar), with no LIDAR in sight. Tesla is the biggest of the bunch, and Elon Musk has repeatedly pushed the idea that if humans can perceive and navigate the world using just eyes, ears and a brain, why can’t a car? I’m certain this approach will achieve amazing results, especially as other talented teams, including Comma and AutoX, work toward the same goal.
It’s important to note that Tesla has an interesting constraint that may have factored into its decision: scale. Tesla hopes to ship 500k cars a year very soon, and can’t wait for LIDAR to come down in cost (or be manufactured in volume) tomorrow; it needed to happen yesterday!