At a Glance:
- The development of robust sensory systems at a commercial scale has a long way to go before self-driving, fully automated vehicles can be declared safe for commercial consumption.
- FMCW LiDAR on a chip may provide the sensory capabilities that will propel autonomous vehicles to commercial success, noted Mehdi Asghari, CEO, SiLC Technologies.
- SiLC’s 4D FMCW LiDAR integrates photonic components, such as lasers, waveguides and amplifiers, onto a single silicon chip; its relevance extends across verticals.
It’s been a while since we spotted those clunky spinning cylinders perched atop autonomous vehicles cruising highways.
First seen on Google’s prototype vehicles in 2010, those unwieldy mechanical devices were designed for function rather than form. The vehicles relied on a suite of on-board sensors, including mapping information systems, video cameras, radar and laser range finders, that could detect objects as far as two football fields away.
Aware back then that it was just the precursor to the miniaturization of self-driving sensor technology, Google noted in a blog post that “our software and sensors do all the work.”
Self-driving innovation has exploded since then, but safety above all else remains the cornerstone of autonomous vehicle design. Autonomous navigation is made possible by the complex process of gathering sensor information—integrating data from a network of cameras, radar, ultrasonic sensors and LiDAR—along with image recognition software that analyzes the data and makes intelligent decisions based on the operating environment.
LiDAR Takes the Lead
The push for precision in detecting what’s in our environment has a long trajectory. The earliest recorded use of sonar (sound navigation and ranging) principles can be traced all the way back to 1490, when Leonardo da Vinci detected vessels by placing an ear to a tube inserted into the water.
Radar (radio detection and ranging), which is used in advanced driver assistance systems, harks back to the late 1880s, when German physicist Heinrich Hertz noticed that electric waves emitted from a transmitter and reflected by a metal surface could be used to detect distant metallic objects.
LiDAR (light detection and ranging) came to the fore after the invention of the laser in the 1960s; it was first used in aerospace applications in the 1970s before being used more extensively in consumer and commercial applications in the 1980s.
Driverless Agenda
Far from having a silver bullet that points the way for autonomous vehicles, the development of robust sensory systems on a commercial scale has a long way to go before self-driving vehicles can be declared fully automated (Level 5) and safe for commercial consumption. For one, piecing together the decision-making process requires real-time imaging and accuracy that rival the complexity of human perception.
Among available options to gather the staggering amount of data needed to make accurate navigation decisions, LiDAR stands out as a sensory system of note for companies working on autonomous vehicles, argued Mehdi Asghari, CEO, SiLC Technologies.
California-based SiLC announced in May that it had signed a deal with Shenzhen-based AutoX, an AV company that provides robotaxi service on some of China’s public roads. AutoX will use SiLC’s LiDAR-based vision solution alongside other sensors in its proprietary hardware stack.
SiLC’s FMCW LiDAR transceiver, known as the Eyeonic Vision Sensor, is a silicon photonics chip the size of a fingernail that integrates full LiDAR functionality. “FMCW (frequency-modulated continuous-wave) LiDAR is basically a technology that enables you to directly measure depth and velocity of objects by measuring the Doppler frequency shift of the light that reflects from them,” said Asghari.
For more on SiLC Technologies’ long-range FMCW LiDAR transceivers, be sure to check out Machine Design's video interview with Mehdi Asghari.
“Compare it to the sound of an ambulance going by,” explained Asghari. “You hear that the sound is a little bit higher in frequency when it comes toward you and a bit lower when it goes away from you. The pitch basically changes.
"That also happens to be the case for light waves. When objects come toward you and when light reflects off them, it has a certain frequency. When objects go away from you, the light frequency shifts. You can use that to measure the frequency, velocity and motion of objects. You can apply the same [technique] to measure distance.”
LiDAR on a Chip
The 4D chip-scale FMCW LiDAR integrates photonic components, such as lasers, waveguides and amplifiers, onto a single silicon chip. The vision system can achieve micrometer-level precision and can see objects up to 500 m away under the right circumstances.
In addition, Asghari explained that Eyeonic uses coherent optics to provide polarization intensity data. “Light travels in two polarizations—transverse electric and transverse magnetic—and contains content that some of our customers may care to know about,” he said. “We provide that information to them and we are the only company that can resolve these two polarization components.”
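As a conceptual illustration of why the second polarization is worth capturing, a receiver that resolves both TE and TM returns can form a simple normalized contrast that varies with surface material and texture. This sketch is an assumption for illustration, not SiLC’s published processing:

```python
def polarization_contrast(intensity_te: float, intensity_tm: float) -> float:
    """Normalized difference of the two returned polarization intensities.
    Different materials and surface finishes split a reflection between
    TE and TM differently, so this single number adds a crude material
    cue on top of depth and velocity. Illustrative only; not SiLC's
    actual algorithm."""
    total = intensity_te + intensity_tm
    return (intensity_te - intensity_tm) / total if total > 0.0 else 0.0

print(polarization_contrast(0.8, 0.2))  # 0.6: strongly TE-dominated return
```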
Although FMCW vision capability using Doppler shifts can provide unique sensory information, Asghari maintains that hurdles remain for many who attempt to design technology around it. A “true” vision system must be independent of lighting, must not interfere with other sensors, must measure velocity and depth, and must provide superior vision while cutting back on the cost of computing power; attempts to build one, Asghari pointed out, quite often end up with hundreds of thousands of dollars’ worth of hardware.
“Companies that are trying to make this now are trying to source as cost-effective components as they can afford, and they end up with a shoebox size at best that simply doesn’t scale,” he said. “It’s not going to fit inside your smartphone. It’s not going to fit inside your AR/VR glasses. It’s not going into smart devices that you use at home.”
Last year, SiLC raised $17 million in a venture funding round to support the launch of its LiDAR imaging chip. The company designs its vision systems on a wafer fab process, the same kind used for making smartphone chips. “We have an amazing level of complexity in these chips that can perform the complex tasks we need them to do, and we can make them at extremely cost-effective prices that enable them to become ubiquitous,” said Asghari.
SiLC joins an exclusive club of firms that already have traction in the automotive space, among them Luminar Technologies, Ouster, Velodyne, Aeva and Innoviz. As a startup, SiLC is carving out an integrated solution and business model designed to reduce size, cost and power consumption at once.
“Our business model is one of a component subsystem supplier,” said Ralf J. Muenster, vice president, Business Development and Marketing at SiLC. “Similar to the Xilinx and Texas Instruments collaboration, people want to buy these components, put them in their system, make the system their own and put their software around it.”
Being part of a photonics manufacturing ecosystem means the business model remains resource intensive, admitted Muenster, who spent 12 years as director for the CTO office of Texas Instruments before joining SiLC. “We need to work closely with customers to design in our components, and in many cases actually teach them how to do it. We have full reference designs, we’ve written the full software, and so that takes time.”
Even so, there is a long runway for LiDAR sensor firms that are poised to succeed. “The next cycle is going to be the Age of Autonomy, where we can get rid of mundane tasks like driving a car for hours on Highway 5 down to Los Angeles. That’s no fun,” said Muenster. “With the current inflation concerns and supply chain issues, should we have humans that either drive lorries or flip burgers or do other things that we just don’t have the human resources for?
"And I think this is going to drive innovation in the next 20 to 40 years…It’s just starting in manufacturing. You see 3D sensing growth going to 1 billion units by 2027, 2D sensors already growing to 10 billion units a year. And we think 4D sensing will be on the same trajectory.”
No Shortage of Applications
The uptake of 4D LiDAR vision sensing technology for solving manufacturing problems is sprouting everywhere in the robotics, industrial automation and mobility space (ADAS, AVs, robotic delivery, robotic trucks, drones, etc.). “There’s a huge pull from companies that are extremely fast growing and have very large market caps and try to solve manufacturing challenges,” said Muenster. “They have a conveyor belt at Amazon, where they’re trying to detect all kinds of different materials and the current sensors are just not good enough.
“Or, [manufacturers] want to go down to 25-micron precision and figure out how to measure that,” he continued. “Yet others want interference-free operation down to the millimeter level, which existing time-of-flight solutions can’t give you. Basically, in factory environments they have limited [sight] and range limited to centimeter-level precision. There are a lot of applications in manufacturing that require just millimeter-level precision—and that’s something that comes out of the sensor.”
Muenster’s ready list of relevant FMCW LiDAR sensing examples includes healthcare (monitoring motion without invading privacy), lifestyle (casino security cameras detecting unruly behavior) and sport (analyzing movement to get an edge on the competition).
For both Muenster and Asghari, automotive provides a “golden opportunity” in the long term, but the “big prize” is the consumer market. As an example, Muenster pointed to the LiDAR sensor on Apple’s iPhone 12 and 13 range of devices, a time-of-flight LiDAR with a range of about five meters and an accuracy of one centimeter.
“If you try to scan your house or anything with it, it’s fairly inaccurate,” Muenster said. “What you really want is millimeter-level precision, and we can provide a wider range. Ultimately, there’s no doubt in my mind that this technology will also go into mobile phones and smart glasses.”
Should Asghari’s claim hold true, SiLC can for the moment bet on being the only company that “can miniaturize such a complex device in such a tiny form factor, and therefore fit it inside a smart glass or inside the cell phone, or inside the smart laptop or tablet.”
Judging by the explosion of use cases across verticals, however, the startup should also expect to face formidable challengers in the near future.