Photographer complains that lidar destroyed his mirrorless camera at CES

‘Photographers, enter the CES floor at your own risk’ is a warning that perhaps should have hung at some stands. A CES visitor reports that an AEye lidar sensor on display at the show damaged his Sony A7R II mirrorless camera.

Robot developer and entrepreneur Jit Ray Chowdhury took photos at CES of a car with a lidar from the start-up AEye mounted on its roof. He then discovered that every photo he took had two bright purple spots, with horizontal and vertical lines running from them through the image. He told Ars Technica that these are burn marks in the camera's sensor.

In a response to the website, AEye's CEO confirms that his company's lidars can indeed damage camera sensors, though he states there is no danger to human eyes. Cameras, he says, can be up to a thousand times more sensitive to lasers than the human eye, so lidar pulses can sometimes damage a camera's image sensor. He adds that he is willing to work on technologies to mitigate this and calls it an issue the entire lidar and laser community needs to address.

Another lidar start-up, Ouster, says its lidars cannot damage cameras. Its lasers have a wavelength of 850nm; because pulses at this relatively short wavelength can reach and harm the human retina, their power must be kept low. Lidars that emit laser pulses at 850 or 905nm have the advantage of being cheap to make, because their components can be built from silicon.

AEye, however, uses lasers with a wavelength of 1550nm. These are more expensive to make because they require not silicon but costlier materials such as indium gallium arsenide. The advantage is that the fluid in the human eye is opaque to light of this wavelength, so the pulses cannot reach the retina. Cameras lack this protection. It also means the lasers can operate at much higher power without endangering people. AEye states that its lidar has a range of up to 1000 meters, while most lidar makers cite a maximum range of 200 to 300 meters.

AEye’s lidar is a time-of-flight system: it measures the distance to an object by sending out a short pulse and timing how long the beam takes to return. There are also FMCW (frequency-modulated continuous wave) lidars, which use a different method: a laser beam is emitted continuously, with its frequency varying slightly over time. The distance is derived from how much the beam's frequency has shifted between the moment of transmission and the reflection's return. A director of Blackmore, a competitor of AEye that uses FMCW lidars, argues that AEye's method carries a greater risk of damaging cameras.
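The two ranging methods above boil down to simple formulas: time-of-flight converts a round-trip time into a distance, while FMCW converts a measured frequency shift (via the known chirp slope) into a delay and then a distance. A minimal sketch, with illustrative numbers that are not tied to any specific lidar product:

```python
# Illustrative sketch of the two lidar ranging methods described above.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_s: float) -> float:
    """Time-of-flight: distance from the pulse's round-trip time."""
    # The beam travels to the target and back, hence the division by 2.
    return C * round_trip_s / 2

def fmcw_distance(freq_shift_hz: float, chirp_slope_hz_per_s: float) -> float:
    """FMCW: distance from the frequency shift of a linearly chirped beam."""
    # A linear chirp means the frequency shift is proportional to the delay.
    delay = freq_shift_hz / chirp_slope_hz_per_s
    return C * delay / 2

# A 1-microsecond round trip corresponds to roughly 150 meters:
print(tof_distance(1e-6))            # ~149.9 m
# With a chirp slope of 1 THz/s, a 1 MHz shift gives the same ~150 m:
print(fmcw_distance(1e6, 1e12))      # ~149.9 m
```

Both functions return the same distance for an equivalent delay; the practical difference lies in the hardware, since FMCW spreads energy over a continuous beam rather than concentrating it in short, intense pulses.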

Lidar sensors determine the distance to objects or surfaces by firing many rapid laser pulses. They work like radar, but with light instead of radio waves: the sensor collects the reflected signal. Lidars are important for the development of self-driving cars.
