“Millimeter Wave Radar is a fool’s errand, and anyone relying on it is doomed!”
I didn’t say those exact words, but whatever I did say amounted to the same thing. I laughed out loud as I said them. You would have laughed too, if you saw what I was looking at.
I was standing on the tarmac at Yuma Marine Corps Air Station under the hot Arizona sun a few days before Christmas, 1990. In front of me was a UH-1 Huey helicopter with a kitchen oven out of a Richard Scarry nightmare mounted on its nose. The kitchen oven had four absurd squarish cones sticking out the front, and oddly shaped copper piping bulging over the sides and the top that could have been models of internal organs from a Dr. Seuss character. This contraption would have looked silly in a science lab. Mounted on the nose of a military helicopter, it looked absolutely ridiculous.
The kitchen oven was a millimeter wave radar.
The helicopter was one of six parked in two rows in a hangar. Most had teams of engineers scurrying all over them, making final adjustments before the first round of flight tests began. My boss Gary and I had already finished. Our shoebox-sized LiDAR looked sophisticated and elegant next to the monstrosity I was laughing at, and, with its subtle heat-sink ridges and flat army-green paint, it looked quite at home on the nose of a military helicopter.
Gary and I had joined teams from other R&D companies and labs answering a call from the DoD as it geared up for its first invasion of Iraq. Army aviation had a new concern as it deployed at scale to Kuwait, and that problem was called “controlled flight into terrain,” or CFIT. I called it “crashing with style.”
I had already spent enough time working on machine perception projects involving helicopters and LiDAR to have a healthy respect for Army helicopter pilots. The things Army pilots could do, even with a Vietnam-era Huey, were incredible by any metric. CFIT was not the result of any lack of skill, but a function of pushing technical limits in a new environment. Helicopter pilots need to fly their Black Hawks very low and very fast in the dark. That is a good way to get shot at less. Under conditions common in the Iraqi desert, night vision goggles could not see sand dunes. It was not rare for a helicopter to gracefully plow into an especially tall sand dune. Casualties were minor; the sand that was invisible to thermal scopes and photomultiplier tubes was also uniform and soft. But it always meant the end of that night’s mission.
Prototypes from labs all over the country had been rushed to Yuma, near Imperial Sands, to test possible solutions in conditions as close to the deserts of Iraq as you can find in the United States. Over the next few days and nights we learned that our LiDAR system performed exactly as we had hoped. It could warn pilots of obstacles with no ambient light, with no thermal signature, while not giving away its position, something the kitchen oven could not do.
Elon Musk is fond of provocative statements like “LiDAR is a fool’s errand, and anyone who uses it is doomed.” Provocative statements get people’s attention, but is that statement true, even within a context relevant to Tesla?
At last April’s Autonomy Day, Tesla gave an incredible presentation in which it described its ability to construct scene information from simple, passive camera systems. In the presentation they made fun of LiDAR as a sensor, because “Nobody drove here today with laser beams coming out of their heads.” It’s a good line. You could use a line like that to dismiss airplanes, elevators, cell phones, and antibiotics.
Why is Elon against LiDAR? He is unambiguous about it, just as I was when I laughed at that Millimeter Wave Radar in Arizona.
It has been 13 years since Velodyne started selling LiDAR systems for autonomous vehicles to support teams in the DARPA challenge. Over a billion dollars of investment have gone into dozens of automotive-focused LiDAR companies since then, but low-cost, effective LiDAR has failed to materialize. According to Velodyne’s own IPO pitch, the state-of-the-art automotive LiDAR system is still a “spinning beanie” that costs over seven thousand dollars.
Disassemble any of the products from the companies now going public through SPACs or raising follow-on rounds from earlier investors, and you will find systems that look eerily similar to what Gary and I were using in Yuma in the early ’90s: laser transmitters, discrete optics, receivers, amplifiers, and motors moving parts around to steer lasers. It is unsurprising that cost reductions for these complicated systems have not materialized.
Velodyne’s presentation calls for its products’ average sale price to drop from over $7K in 2019 to $600 by 2024. Elon and the entire transportation industry have been hearing announcements like this for the last decade from a flock of well-funded companies. It has not happened yet. Elon’s skepticism is well-founded.
Despite its cost and despite its complexity, LiDAR is useful for a variety of applications, just as we found out in Yuma, and Tesla’s Autopilot is only one of them.
Because it supplies its own illumination, LiDAR can provide data where cameras cannot see. LiDAR provides spatial resolution that radar cannot match, and, if done right, operates with near-perfect immunity to interference from environmental sources or other LiDARs.
High resolution 3D data generated by LiDAR turns scene analysis and real-time navigation tasks into a 5th grade geometry problem. No need for AI algorithms that simulate a cerebral cortex.
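As a hedged illustration of how simple that geometry can be, here is a toy Python check over a LiDAR point cloud: flag anything that sticks up from the ground within sensor range. The coordinates, frame convention, and thresholds are invented for the example, not taken from any real system.

```python
# Toy obstacle check on a LiDAR point cloud (vehicle frame, meters).
# Convention assumed here: x = forward, y = left, z = up, ground near z = 0.
# Thresholds are illustrative only.
import math

def flag_obstacles(points, min_height=0.3, max_range=50.0):
    """Return points that rise above the ground plane within sensor range."""
    obstacles = []
    for x, y, z in points:
        rng = math.sqrt(x * x + y * y + z * z)  # straight-line distance
        if z > min_height and rng < max_range:
            obstacles.append((x, y, z))
    return obstacles

cloud = [
    (10.0, 0.5, 0.05),   # ground return
    (12.0, -1.0, 1.2),   # tall enough to hit
    (60.0, 2.0, 2.0),    # beyond the range of interest
]
print(flag_obstacles(cloud))  # -> [(12.0, -1.0, 1.2)]
```

A real pipeline would fit the ground plane instead of assuming it, but the point stands: with direct 3D measurements, “is there something in my path?” reduces to comparisons and a distance formula.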
Reduce the size, reduce the cost, increase the capabilities, and even Elon will have to change his mind about LiDAR.
By implementing all the tricky parts of LiDAR on a single semiconductor chip, and by using off-the-shelf subcomponents and processes already used by the semiconductor industry, Voyant can make imaging LiDAR devices that consist of nothing more than two small circuit boards and an ordinary lens tucked into a 5 cm long case. No motors. No discrete optical components. No external lasers. We expect production costs around 1/20th of Velodyne’s current average sale price.
Beyond being tiny and inexpensive, with no moving parts, our LiDAR offers features that cameras and radar cannot match.
For starters, our LiDAR is not stuck scanning a field of view in a fixed pattern determined by motor-driven optics. It can focus on any number of regions of interest within the field of view, at any update rate an application might need, with no lag or hysteresis. Very handy for seeing into glare or shadow areas when cameras are blinded by environmental lighting changes. Very handy for tracking specific objects or hazards at fast update rates, without waiting for a motor to spin the detector all the way around the field of view.
FMCW LiDAR works in any amount of ambient light. Or none. Immunity to ambient lighting conditions is an advantage FMCW LiDAR has over cameras. Voyant’s LiDAR can see where cameras cannot.
Our devices use FMCW LiDAR at 1550 nm. FMCW has advantages over time-of-flight, or pulsed, systems but is not widely used. FMCW is more complicated and requires more parts, just as an FM radio requires more parts than an AM radio, which increases cost.
Yet FM radio works better, and once you can pack the transistors onto a single device, the cost of individual components is no longer a factor.
A similar argument applies to FMCW LiDAR.
FMCW provides immediate, point-by-point Doppler velocity with unprecedented accuracy.
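The arithmetic behind that velocity measurement is simple enough to sketch. In a triangular-chirp FMCW system, target motion shifts the up-chirp and down-chirp beat frequencies in opposite directions, so averaging them recovers the range term and differencing them recovers the Doppler term. The chirp parameters below are illustrative values for the sketch, not Voyant’s actual specifications.

```python
# Range/velocity extraction sketch for a triangular-chirp FMCW LiDAR.
# f_up / f_down are the beat frequencies measured during the up- and
# down-chirp; the constants are illustrative, not real device specs.
C = 299_792_458.0        # speed of light, m/s
WAVELENGTH = 1550e-9     # carrier wavelength, m
BANDWIDTH = 1.0e9        # chirp bandwidth, Hz
T_CHIRP = 10e-6          # duration of one chirp ramp, s

def range_and_velocity(f_up, f_down):
    """Separate range and Doppler contributions from the two beat tones."""
    f_range = (f_up + f_down) / 2.0            # motion cancels in the average
    f_doppler = (f_down - f_up) / 2.0          # range cancels in the difference
    distance = C * f_range * T_CHIRP / (2.0 * BANDWIDTH)   # m
    velocity = WAVELENGTH * f_doppler / 2.0    # m/s, positive = approaching
    return distance, velocity

# Example: beat tones of 5.0 MHz (up) and 7.58 MHz (down)
d, v = range_and_velocity(5.0e6, 7.58e6)
print(f"{d:.1f} m, {v:.2f} m/s")  # -> 9.4 m, 1.00 m/s
```

Note the scale: at 1550 nm, a mere 1 m/s of closing speed shifts the beat by about 1.3 MHz, which is why FMCW LiDAR resolves velocity per point instead of inferring it from successive frames.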
What about Tesla’s claim that LiDAR cannot differentiate between a pothole and a plastic bag?
With every return, our FMCW LiDAR receives reflectance and polarization measurements that let it differentiate pavement from metal, hands from coffee mugs, street signs from rubber tires, and, of course, a pothole from a plastic bag. We expect to read painted markings on asphalt, in total darkness, far past where cameras could. FMCW LiDAR is like a sensor from Star Trek: it can tell you where something is, how fast it is moving, and what it is made of.
Region-of-interest scanning coupled with reflectance measurements could help Tesla differentiate between reflective signs and headlights, and stop automatically cutting off my high beams every time a street sign comes into view on a dark country road. Because that is freaky.
Voyant’s devices are not science fiction. They fit in your hand. They are real. We plan on producing more of them by the end of 2022 than Velodyne has sold across all its products in the last 13 years.
We hope Voyant’s products can change Elon’s mind about LiDAR, but we are not developing these for Tesla. While our specs meet the needs of every automotive company we have spoken with, we see applications beyond automotive.
We are developing our imaging LiDAR systems for the millions of devices that need to perceive their environments. Fifteen years ago, when CCD cameras were the state-of-the-art vision sensor, few predicted a world of tens of billions of cameras, most of which operate without any direct human control.
At Voyant, we see the same thing happening with LiDAR.
At some level of price and performance LiDAR will become a pervasive sensor built into everything.
Maybe even a Tesla.