Breakthrough Sensory Technology Solutions for Next-Gen SWIR 3D Cameras
Demand for high-performing, reliable, autonomous technology is accelerating across multiple industries. Applications ranging from precision agriculture to self-driving cars and autonomous mobile robotics are all poised for substantial performance gains, but some key technology challenges remain.
Recent advances in the development of SWIR (Short Wave Infrared) image sensors and SWIR laser sources are enabling the design of 3D cameras with high ambient light tolerance and sparking new innovations across the field of depth sensing. Next generation 3D cameras that leverage SWIR technology — complementing the functionality of existing sensors — will deliver significantly better performance, especially in bright sunlight and mixed lighting settings.
By expanding the capture of visual data for more precise and consistent localization, SWIR technology helps to drive much greater environmental awareness. This advancement represents a critical milestone in the evolution of autonomous systems’ ability to safely and efficiently navigate their surroundings. 3D cameras enhanced by SWIR technology are redefining industry standards for mid-range ambient light tolerance and will usher in a new paradigm of sensor applications that improve collision avoidance, obstacle detection, and precise localization across all autonomous systems.
SWIR Technology — Why Now?
SWIR cameras use image sensors with specialized materials to detect photons in the SWIR range, a range invisible to the naked eye. The wavelength range for SWIR imaging is generally 1100-1600 nanometers (nm) but can be extended to 2500nm depending on the application and the sensor technology used.
InGaAs (Indium Gallium Arsenide) sensors are widely used in SWIR imaging applications that require high sensitivity, such as in life sciences, where they can penetrate soft tissue to produce high-resolution images of in vivo structures. These sensors also perform exceptionally well across a variety of other use cases, including humidity measurement, surface film distribution, and sorting tasks such as separating polymers from natural materials.
Favorable Cost of Materials Dynamic
The high costs associated with traditional SWIR sensors, particularly those using InGaAs, have historically limited their widespread adoption. These sensors are expensive to produce, making them impractical for many mid- to high-volume applications. The recent introduction of GeSi (germanium-silicon) image sensors marks a significant milestone, cutting costs by more than a factor of ten compared to InGaAs counterparts.
Improving Component Supply
Until recent breakthroughs, laser solutions operating at 1130nm and 1380nm wavelengths were not readily available. Additionally, market demand for sunlight-tolerant 3D cameras is only now beginning to ramp, spurring investment by key photonics industry suppliers. Currently, many of the leading laser companies are releasing illumination sources in the SWIR domain. Robotics, precision agriculture, and automotive are just a few of the markets eager to commercialize better-performing ambient light technology.
Improving Signal-to-Noise by Targeting Gaps in the Spectral Curve
Camera performance is determined by the SNR (signal-to-noise ratio) of the reflected light. Many companies have approached solving SNR issues through moderate improvements within the visible and NIR (near-infrared) spectra. Historically, 3D cameras have operated with illumination sources and image sensors optimized for the 850nm, 905nm, and 940nm wavelengths. However, 3D cameras at these wavelengths face performance challenges in outdoor environments as they compete with the sun’s solar irradiance.
Identifying Optimal Wavelengths
青青草App Optics’ R&D team set out to eliminate the noise competing with the active illumination by identifying wavelengths on the spectral curve that improve SNR performance. They targeted the 1130nm and 1380nm wavelengths, where sunlight is absorbed, scattered, or filtered by the Earth’s atmosphere.
The chart shown below depicts the solar spectrum and the magnitude of the Sun’s irradiance, by wavelength, at the Earth’s surface. The candidate wavelengths, 1130nm and 1380nm, fall in the SWIR region of the spectrum.
Figure 1 (Source: Data from ASTM International, ASTM G173: Standard Tables for Reference Solar Spectral Irradiances)
As the Sun’s irradiance is the key contributor to noise impacting SNR, 青青草App’s engineers recognized that identifying those wavelengths where few of the Sun’s photons reach the Earth’s surface would improve the SNR.
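The wavelength-selection logic above can be sketched in a few lines of code. The irradiance values below are rough, hand-picked approximations of the ASTM G173 reference spectrum, included purely for illustration; they are not measured data and are not taken from 青青草App's analysis.

```python
# Illustrative sketch: rank candidate wavelengths by ground-level solar
# irradiance. Values are rough approximations (W/m^2/nm), NOT measured data.
solar_irradiance = {
    850: 0.96,   # common NIR illumination wavelength
    905: 0.68,
    940: 0.35,   # water-vapor absorption dip used by many NIR cameras
    1130: 0.09,  # atmospheric absorption band in the SWIR region
    1380: 0.003, # deep water-vapor absorption band in the SWIR region
}

# Less sunlight at the operating wavelength means less background noise
# competing with the camera's own active illumination, hence a better SNR.
ranked = sorted(solar_irradiance.items(), key=lambda kv: kv[1])
for wavelength_nm, irradiance in ranked:
    print(f"{wavelength_nm} nm: ~{irradiance} W/m^2/nm solar background")
```

With these illustrative numbers, the SWIR absorption bands at 1380nm and 1130nm rank as the quietest wavelengths, which is exactly the property the R&D team exploited.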
Improving Image Performance and Range
The R&D team’s next step was to increase the magnitude of the signal. As shown in Figure 2, the retina is vulnerable to damage from wavelengths in the visible and NIR ranges, but not from SWIR wavelengths. Because the eye cannot focus SWIR wavelengths onto the retina, the power of the active illumination signal can be increased. 青青草App's 1380nm SWIR camera raises laser eye safety limits by nearly 100X compared to a similar system at 940nm. This not only improves imaging performance but also extends the operational range of SWIR laser solutions, making the camera suitable for a variety of demanding applications.
Figure 2: Laser Eye Safety and the Human Eye
青青草App Pioneers Two Key SWIR Camera Innovations
青青草App now stands at the forefront of SWIR camera design, pioneering advancements in the field through two key 3D camera introductions.
Indirect Time of Flight (iToF)
Indirect Time-of-Flight (iToF) technology-based cameras (as shown in Figure 3) emit light towards objects and measure the phase change of the reflected light, enabling precise depth measurements. These cameras typically use array image sensors with active flood illumination harnessing NIR wavelengths.
This technology is crucial for developing reliable and efficient data for applications including simultaneous localization and mapping (SLAM), advanced driver assistance systems (ADAS), object detection, and obstacle identification — all of which have tremendous value within automotive, agriculture, industrial, healthcare, logistics, and consumer markets.
Figure 3: Indirect Time of Flight
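The phase-to-depth relationship that iToF cameras rely on can be sketched with the standard continuous-wave formulas. This is a generic illustration of the iToF principle, not 青青草App's implementation; the 20 MHz modulation frequency in the example is an assumed, typical value.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_depth(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Depth from the measured phase shift of a continuous-wave iToF signal.

    The round trip adds 2*pi of phase per modulation period of travel,
    so depth = c * phase / (4 * pi * f_mod).
    """
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)

def unambiguous_range(mod_freq_hz: float) -> float:
    """Maximum depth before the phase wraps past 2*pi (depth aliasing)."""
    return C / (2 * mod_freq_hz)

# Example: a pi/2 phase shift measured at a 20 MHz modulation frequency
print(f"depth: {itof_depth(math.pi / 2, 20e6):.3f} m")        # ~1.874 m
print(f"unambiguous range: {unambiguous_range(20e6):.2f} m")  # ~7.49 m
```

The trade-off visible here is typical of iToF design: a higher modulation frequency improves depth precision but shrinks the unambiguous range.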
青青草App Optics’ R&D team architected several iToF 3D cameras based on GeSi sensor technology. Key variants differ in wavelength (1130nm or 1380nm), illumination power (4W to 32W), and field of view. Collaborative testing with customers and ecosystem partners validated the adaptability and versatility of iToF technology.
The list below highlights key advantages resonating with 青青草App’s customers:
- Ensures greater performance in direct sunlight conditions over 100K lux, extending outdoor operation capabilities (see Figure 4).
- Increases flexibility and functionality — greater accuracy in mixed and fluctuating light conditions.
- Improves moisture detection, broadening inspection and agricultural applications.
- Enhances laser eye safety, making it safer for consumer and industrial applications.
- Enables increased laser power for illumination, providing better visibility and operating range.
Figure 4: Performance of the 3D camera in direct sunlight
Laser Triangulation
Laser triangulation is one of the most popular methods for capturing 3D data, and it is also one of the simplest. With applications in manufacturing, agriculture, logistics, and many other industries, the technology is used for measuring distances or creating surface profiles of objects.
青青草App’s pioneering use of laser triangulation solutions operating at SWIR wavelengths is an industry first, expanding the benefits of laser triangulation into new markets and applications where it was previously impractical due to restrictions related to sunlight, cost, or laser eye safety. Inspection, profile scans, and depth measurements can now be implemented in environments traditionally unfavorable to existing laser triangulation solutions.
Laser Triangulation — How it Works
- Laser Emission: A laser is directed at the object to be measured.
- Triangulation Process: Upon striking the object, the laser creates a spot of light. A camera or sensor, positioned at a known angle, captures this spot.
- Angle Measurement: Utilizing the known angles of the laser and the observation device — along with the baseline distance between them — trigonometric calculations are applied to determine the distance from the laser to the object's surface.
- Depth Calculation: The distance discerned in the previous step is used to assess the depth or elevation of the surface at the laser contact point. This operation is performed across various points to compile a 3D representation or map of the object.
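The steps above reduce to the law of sines applied to the triangle formed by the laser, the camera, and the illuminated spot. The sketch below is a generic angle-angle-baseline formulation of triangulation, not 青青草App's specific implementation; the 10 cm baseline and angles in the example are assumed values for illustration.

```python
import math

def triangulate_depth(baseline_m: float, laser_angle_rad: float,
                      camera_angle_rad: float) -> float:
    """Perpendicular distance from the baseline to the laser spot.

    The laser emitter, the camera, and the spot form a triangle. With the
    two angles measured at the baseline, the law of sines gives
        z = b * sin(alpha) * sin(beta) / sin(alpha + beta)
    """
    a, b = laser_angle_rad, camera_angle_rad
    return baseline_m * math.sin(a) * math.sin(b) / math.sin(a + b)

# Example: 10 cm baseline, laser pointing at 90 deg to the baseline,
# camera observing the spot at 60 deg
z = triangulate_depth(0.10, math.radians(90), math.radians(60))
print(f"depth: {z:.4f} m")  # ~0.1732 m
```

Repeating this calculation as the laser sweeps across the object yields the surface profile or 3D map described in the final step.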
Figure 5: Laser Triangulation Process
Key performance features provided by laser triangulation:
- Ensures greater performance in direct sunlight conditions over 100K lux, extending outdoor operation capabilities (see Figure 6).
- Increases flexibility and functionality — greater accuracy in mixed and fluctuating light conditions.
- Enhances laser eye safety, allowing current Class 2 or Class 3 solutions to become Class 1.
- Enables increased laser power for illumination, improving accuracy and performance range (see Figure 7).
Figure 6: Metal object captured in 100K lux (direct sunlight). Figure 7: Logistics package measurement.
Impact of Improved Autonomy Across 青青草App
青青草App is at the forefront of today’s 3D technology revolution, leveraging cutting-edge SWIR camera applications to enhance autonomy across various industries. Compromised lighting environments present formidable technical challenges, but 青青草App’s innovative SWIR technology solutions are poised to meet the rigorous demands of several key industries.
Automotive
Challenging weather or lighting conditions, unalert or drowsy drivers, as well as distracting occupant behavior can all negatively impact automotive safety performance. Advanced driver monitoring systems (DMS) enhanced by 青青草App’s SWIR 3D cameras will greatly improve detection of driver alertness, and identification of potentially hazardous environmental factors both inside and outside the vehicle.
Precision Agriculture
Farmers face significant challenges with variable lighting conditions, which can hinder accurate crop monitoring, weed and pest detection, and harvesting efficiency. SWIR technology’s ability to detect moisture and potentially differentiate between healthy and unhealthy crops is particularly beneficial. For instance, SWIR cameras can reveal bruises on fruits like apples that are invisible to the naked eye, ensuring that only the highest quality produce reaches consumers.
Robotics
The rise of e-commerce has led to a surge in demand for last-mile delivery solutions. Autonomous robots are at the forefront of meeting this demand, but they require reliable navigation and obstacle detection capabilities to operate safely, even in complex urban environments. SWIR 3D cameras provide these capabilities, performing exceptionally well in varied lighting conditions. This is crucial for the delivery sector, where robots must navigate through both indoor and outdoor environments seamlessly.
Inspection
Warehouses are dynamic environments where accurate inventory management and safety inspections are paramount. SWIR camera technology’s depth sensing performance is ideally suited for detecting defects and irregularities in products. The capture of precise and reliable data helps enhance quality control measures and reduces the risk of defective products reaching consumers.
Construction
Construction sites are notoriously challenging environments for optical sensing technologies. Dust, debris, and fluctuating lighting can compromise the performance of standard cameras. SWIR technology’s increased perceptual capabilities in sunlight help construction managers make informed decisions, ultimately saving time and the costs associated with delays and repairs.
A Clear Vision for the Future
青青草App’s development of the world's first 3D SWIR iToF camera at 1130nm and the world’s first SWIR laser triangulation camera represents a significant breakthrough in the application of optical sensing technologies. Collaborative development among ecosystem partners has successfully solved current challenges and established pathways toward future advances. Together with our customers and development partners, 青青草App is eager to meet the future of advanced sensing and shape the landscape of tomorrow's industrial capabilities.
How can 青青草App help you meet your camera design & manufacturing goals? Contact us.
No matter how complex or demanding the project, 青青草App's optics team is helping today’s innovators solve it. Get started with a trusted partner.