Semiconductors Product of the Year
Product Name: ADTF3175 – One Megapixel Time-of-Flight Module
Company: Analog Devices
Enabling a New Era in 3D Machine Vision Systems
The ADTF3175 is the first high-resolution, industrial-quality, indirect Time-of-Flight (ToF) module for 3D depth sensing and vision systems. The module delivers a turnkey, scalable system that provides high accuracy and robustness over variable environmental conditions. Moreover, the module removes the burden of optical and electromechanical system design, delivering a fully engineered and calibrated depth system that lets designers concentrate on bringing new 3D sensing and vision systems to market.
Time-of-Flight technology is complex for typical customers to implement. It requires capabilities in optical, mechanical, and high-speed electrical design, combined with sophisticated system engineering expertise and supply-chain breadth in chip-on-board assembly, optics, and more. This complexity has limited access to this groundbreaking technology, and today's custom-engineered, high-quality depth cameras tend to be expensive. The goal of the ADTF3175 is to democratize access to high-performance ToF for the broadest base of users while bringing to bear Analog Devices’ expertise and understanding of the industrial market with respect to technical challenges, quality, and product lifetime.
Based on the ADSD3100, a 1 Megapixel CMOS indirect Time-of-Flight (iToF) imager, the ADTF3175 also integrates an infrared illumination source (laser, laser driver, and optics), a receive path with a lens and an optical band-pass filter, flash memory for calibration data and firmware storage, and power regulators that generate the local supply voltages.
The ADTF3175 measures 42mm × 31mm × 15.1mm and is specified over an operating temperature range of −20°C to 65°C. It delivers industry-leading accuracy of ±3mm across a depth range of 0.4m – 4m (15% reflectivity), with a maximum depth-noise standard deviation of 15mm, assuming less than 5 klux of equivalent sunlight. The ADTF3175 transmits raw data to the host over a 4-lane Mobile Industry Processor Interface (MIPI) Camera Serial Interface 2 (CSI-2). Module programming and operation are controlled through a 4-wire SPI or an I2C serial interface.
The module comes pre-calibrated with four modes of operation (short and long range, at 512×512 binned and 1024×1024 native resolutions). All the necessary calibration data is stored in the system flash memory and can be accessed by the depth processing engine. The module is also temperature compensated and comes programmed with the necessary correction model. Camera intrinsics and distortion parameters are likewise available from the flash for depth processing and fusion applications. Users can cycle between these modes on a depth frame-to-frame basis and enable complementary capabilities at the algorithm/fusion level.
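To illustrate why the stored camera intrinsics matter for fusion applications, the sketch below deprojects a depth image into a 3D point cloud using the standard pinhole model. The intrinsic values (fx, fy, cx, cy) here are purely illustrative placeholders, not the ADTF3175's actual calibration, which would be read from the module's flash.

```python
import numpy as np

def depth_to_points(depth_m, fx, fy, cx, cy):
    """Deproject a depth image (in meters) to 3D points via the pinhole model.
    (u, v) are pixel coordinates; the stored intrinsics map them to rays."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # both shaped (h, w)
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.stack([x, y, depth_m], axis=-1)  # shape (h, w, 3)

# Toy 2x2 depth map with hypothetical intrinsics (NOT module calibration data).
depth = np.full((2, 2), 1.0)
pts = depth_to_points(depth, fx=800.0, fy=800.0, cx=0.5, cy=0.5)
```

In practice, lens distortion would be corrected (using the distortion parameters also stored in flash) before this deprojection step.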
An integral part of the toolset offered by ADI is our proprietary software depth computation engine, which users can leverage alongside the module. This engine embodies decades’ worth of IP and know-how in ToF and is a key piece of the signal chain that allows the ADTF3175 to achieve its accuracy and performance. While today it is a software algorithm, in the very near future users will also be able to access the same capability through a dedicated image signal processor, delivering a step change in latency and power consumption and opening new horizons for depth sensing.
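ADI's depth engine itself is proprietary, but the core idea behind continuous-wave iToF depth recovery can be sketched with the textbook four-sample demodulation: the sensor correlates the returned light with the modulation signal at four phase offsets, recovers the round-trip phase, and converts it to distance. This is a generic illustration of the principle, not ADI's algorithm.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_depth(a0, a90, a180, a270, f_mod_hz):
    """Recover depth from four correlation samples taken at 0/90/180/270-degree
    phase offsets (textbook continuous-wave iToF demodulation)."""
    phase = math.atan2(a90 - a270, a0 - a180)  # phase in (-pi, pi]
    phase %= 2 * math.pi                       # wrap into [0, 2*pi)
    return C * phase / (4 * math.pi * f_mod_hz)

# Synthetic check: ideal samples for a 1.0 m target at a 100 MHz modulation.
f_mod = 100e6
true_d = 1.0
phi = 4 * math.pi * f_mod * true_d / C
a0, a90 = math.cos(phi), math.sin(phi)
a180, a270 = -math.cos(phi), -math.sin(phi)
print(itof_depth(a0, a90, a180, a270, f_mod))  # recovers ~1.0 m
```

A production engine layers much more on top of this: multi-frequency phase unwrapping, temperature correction, and noise filtering, which is where the accuracy and latency differentiation comes from.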
The ADTF3175 builds on the collaboration between Analog Devices and Microsoft to deliver the highest-resolution 3D depth sensing module to the broad market. Designers can experience the benefits of the ADTF3175 by using our evaluation kit, which effectively turns the module into a USB camera by mating it to an NXP i.MX 8M SOM. The kit doubles as a complete reference design for interfacing the ToF module to the NXP application processor, covering power management and including all of the necessary drivers and software. Beyond NXP, ADI has also worked with a number of other suppliers to develop support for the ADTF3175 within their ecosystems (e.g. Qualcomm, ARM, OpenCL, NVIDIA). Information to achieve Class 1 eye safety certification is also provided; the module monitors several internal parameters to ensure that safety limits are not exceeded.
The ADTF3175 is built around the ADSD3100, the first megapixel time-of-flight sensor on the market. Beyond the high resolution and the smallest available pixel size (3.5 μm × 3.5 μm), the sensor can operate at modulation frequencies up to 320 MHz, far higher than any other sensor on the market or reported in published work. Operating at a high modulation frequency reduces phase errors. The exact modulation scheme is critical IP developed over many years, and it allows the ADTF3175 to provide the lowest noise while simultaneously achieving the lowest possible latency. Latency is a critical aspect of many applications, such as AR and robotics.
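The benefit of a high modulation frequency follows directly from the phase-to-depth relationship d = c·φ/(4π·f_mod): for a fixed amount of phase noise, depth noise shrinks as 1/f_mod, while the single-frequency unambiguous range c/(2·f_mod) also shrinks. The sketch below compares an assumed 75 MHz baseline against 320 MHz for an illustrative 10 mrad of phase noise; both figures are assumptions for illustration, not ADTF3175 specifications. In practice, multi-frequency schemes are commonly used to recover a long range despite the short single-frequency ambiguity interval.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def unambiguous_range_m(f_mod_hz):
    # Round-trip phase wraps every 2*pi, so one frequency aliases at c / (2 * f_mod).
    return C / (2 * f_mod_hz)

def depth_noise_m(phase_noise_rad, f_mod_hz):
    # depth = c * phase / (4 * pi * f_mod), so depth noise scales as 1 / f_mod.
    return C * phase_noise_rad / (4 * math.pi * f_mod_hz)

# Illustrative comparison at the same assumed 10 mrad phase noise.
for f_mod in (75e6, 320e6):
    print(f"{f_mod / 1e6:.0f} MHz: "
          f"unambiguous range {unambiguous_range_m(f_mod):.2f} m, "
          f"depth noise {depth_noise_m(0.01, f_mod) * 1000:.2f} mm")
```

Running the comparison shows the trade: 320 MHz cuts depth noise by more than 4x relative to 75 MHz, at the cost of a sub-half-meter single-frequency ambiguity interval.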
The applications for depth sensing and time-of-flight are many and growing at an exponential pace. As machines transition to autonomous modes of operation and take on more and more of the decision-making process, they require the same ability to perceive in three dimensions that humans have. Today, depth is typically obtained using triangulation methods (i.e., stereo 2D cameras or structured light), but these approaches have a number of limitations compared with true depth measurement. Autonomous mobile robots (AMRs) require the ability to see all around them and to navigate safely in environments that contain obstacles and span a wide range of illumination conditions. Collaborative robots (cobots) must work alongside humans, and safety is paramount; that safety can only come with accurate, low-latency imaging of the shared space.
In addition to machine perception, we are now at the inflection point of the ‘Metaverse’ economy, which is likely to impact everything from gaming and entertainment to retail, healthcare, education, and industry. Depth sensing is the gateway to the Metaverse whenever physical objects or spaces need to be virtualized to create the required experience.