The new system, developed by researchers at the University of California, Berkeley, can remotely sense objects up to 30 feet away, 10 times farther than comparable current low-power laser systems.
With further development, the technology could be used to make smaller, cheaper 3-D imaging systems that offer exceptional range for potential use in self-driving cars, smartphones and interactive video games like Microsoft's Kinect, all without the need for big, bulky boxes of electronics or optics.
UC Berkeley's Behnam Behroozpour, who will present the team's work at CLEO: 2014, being held June 8-13 in San Jose, California, USA, said that while meter-level operating distance is adequate for many traditional metrology instruments, the sweet spot for emerging consumer and robotics applications is around 10 meters, or just over 30 feet.
The new system relies on LIDAR (light detection and ranging), a 3-D imaging technology that uses laser light to provide feedback about the world around it. LIDAR systems of this type emit laser light that bounces off an object, and then determine how far away that object is by measuring changes in the frequency of the light reflected back.
It can be used to help self-driving cars avoid obstacles halfway down the street, or to help video games tell when you are jumping, pumping your fists or swinging a "racket" at an imaginary tennis ball across an imaginary court.
In their new system, the team used a type of LIDAR called frequency-modulated continuous-wave (FMCW) LIDAR, which they felt would give the imager good resolution at lower power consumption, Behroozpour says.
This type of system shines "frequency-chirped" laser light (that is, light whose frequency steadily increases or decreases) on an object and then measures the frequency of the light reflected back; the difference between the outgoing and returning frequencies is proportional to the object's distance. (ANI)
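The article does not give the team's actual sweep parameters, but the FMCW ranging principle it describes can be sketched with illustrative numbers: if the laser frequency is swept linearly, the reflected light returns after a round-trip delay and beats against the outgoing light at a frequency proportional to distance.

```python
# Minimal sketch of FMCW LIDAR ranging. All numbers are illustrative
# assumptions, not values from the Berkeley system.

C = 3.0e8  # speed of light, m/s


def beat_from_distance(distance_m, chirp_bandwidth_hz, chirp_period_s):
    """Beat frequency produced by a target at distance_m.

    The chirp slope is B / T (Hz per second); the round-trip delay is
    2 * d / c, so the outgoing and returning chirps differ by
    slope * delay hertz.
    """
    slope = chirp_bandwidth_hz / chirp_period_s
    delay = 2.0 * distance_m / C
    return slope * delay


def distance_from_beat(beat_hz, chirp_bandwidth_hz, chirp_period_s):
    """Invert the relation above: d = c * f_beat / (2 * slope)."""
    slope = chirp_bandwidth_hz / chirp_period_s
    return C * beat_hz / (2.0 * slope)


# Example: a hypothetical 1 THz sweep over 1 ms (slope 1e15 Hz/s).
# A target at the article's 10-meter "sweet spot" has a round-trip
# delay of about 67 ns, giving a beat near 67 MHz.
beat = beat_from_distance(10.0, 1.0e12, 1.0e-3)
recovered = distance_from_beat(beat, 1.0e12, 1.0e-3)
print(f"beat = {beat / 1e6:.1f} MHz, recovered range = {recovered:.2f} m")
```

Measuring that beat frequency electronically, rather than timing individual light pulses, is what lets an FMCW system resolve distance precisely at low optical power.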